69 results
Search Results
2. Basic color categories and Mandarin Chinese color terms.
- Author
-
Sun, Vincent C. and Chen, Chien-Chung
- Subjects
COLOR vision ,CHINESE color prints ,COLORED paper ,TOUCH screens ,MULTIPLE correspondence analysis (Statistics) - Abstract
Basic color terms used in Mandarin Chinese have been controversial since they were first discussed by Berlin and Kay in 1969. Previous studies showed much inconsistency in what should be considered basic color terms in Mandarin Chinese. In the present study, we investigated categories of color rather than merely the color terms used by Taiwanese native Mandarin speakers. Using samples conforming to the Berlin and Kay survey, various colors were chosen from a collection of Natural Color System (NCS) colored papers and mounted on a piece of neutral gray card. The card was then mounted on a touch screen under D65 illumination. Thirty-two single-character color-related Mandarin terms were selected from a Chinese character database according to frequency of use. Participants were required to select the color sample that matched each term by pressing a virtual button on the touch screen. The results show that certain terms can be directly correlated with basic color terms in English, comparable with the results of Berlin and Kay's original study and those that followed. However, some terms, such as Mo (墨 ink), Tie (鐵 iron), and Cai (菜 vegetable), show widely spread term maps and inconsistent use among subjects. Principal component analysis (PCA) procedures were used to analyze the commonality of the data among subjects. The findings suggest that the basic color categories among Mandarin Chinese speakers are similar to those found in the World Color Survey (WCS), but are represented by widely spread and inconsistent color terms among speakers. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
3. A two-stage filter for removing salt-and-pepper noise using noise detector based on characteristic difference parameter and adaptive directional mean filter.
- Author
-
Yufeng Nie and Hongjin Ma
- Subjects
Computer and Information Sciences ,Imaging Techniques ,Digital Imaging ,Image Analysis ,Image Enhancement ,Median filter ,Salt-and-pepper noise ,Pixel ,Noise ,Algorithms ,Research Article - Abstract
In this paper, a two-stage filter for removing salt-and-pepper noise, using a noise detector based on a characteristic difference parameter and an adaptive directional mean filter, is proposed. The first stage detects the noise-corrupted pixels by combining the characteristic difference parameter with gray-level extremes, then applies an improved adaptive median filter for an initial restoration. The second stage introduces a restoration scheme that further restores the noise-corrupted pixels: it divides them into two types and applies a different restoration technique to each type based on the classification result. One type is restored by the mean filter; the other is restored by the proposed adaptive directional mean filter, which adaptively selects the optimal filtering window and direction template and then replaces the gray level of each noise-corrupted pixel with the mean value of the pixels on the optimal template. Experimental results show that the proposed filter outperforms many existing filters in terms of noise suppression and detail preservation.
- Published
- 2018
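The two-stage scheme summarized in the abstract above can be sketched roughly as follows. This is a minimal illustration, not the authors' method: their characteristic difference parameter and directional templates are replaced here by a plain gray-level-extreme detector and a mean over clean 8-neighbors.

```python
# Minimal two-stage salt-and-pepper sketch (illustrative only):
# stage 1 flags pixels sitting at the gray-level extremes as noise,
# stage 2 restores each flagged pixel from its non-noisy neighbors.

def remove_salt_and_pepper(img, lo=0, hi=255):
    """img: 2-D list of gray levels; returns a restored copy."""
    rows, cols = len(img), len(img[0])
    # Stage 1: noise detection by gray-level extremes.
    noisy = {(r, c) for r in range(rows) for c in range(cols)
             if img[r][c] in (lo, hi)}
    out = [row[:] for row in img]
    # Stage 2: restore each noisy pixel with the mean of clean neighbors.
    for r, c in noisy:
        clean = [img[rr][cc]
                 for rr in range(max(0, r - 1), min(rows, r + 2))
                 for cc in range(max(0, c - 1), min(cols, c + 2))
                 if (rr, cc) != (r, c) and (rr, cc) not in noisy]
        if clean:
            out[r][c] = sum(clean) // len(clean)
    return out
```

A real implementation would also grow the window adaptively when too few clean neighbors exist, which is what the paper's adaptive filters address.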
4. Automatic ICD-10 coding algorithm using an improved longest common subsequence based on semantic similarity.
- Author
-
Chen, YunZhi, Lu, HuiJuan, and Li, LanJuan
- Subjects
VIRAL disease diagnosis ,CHINESE people ,SYMPTOMS ,MEDICAL records ,GASTROENTEROLOGY ,DISEASES - Abstract
ICD-10 (International Classification of Diseases, 10th revision) is a classification of diseases, symptoms, procedures, and injuries. Diseases are often described in patients' medical records in free text, such as terms, phrases and paraphrases, which differ significantly from those used in the ICD-10 classification. This paper presents an improved approach based on the Longest Common Subsequence (LCS) and semantic similarity for automatic Chinese diagnosis coding, mapping the disease names given by clinicians to the disease names in ICD-10. The LCS of a set of strings is the longest string that is a subsequence of every member of the set. The improved LCS method proposed in this paper increases the accuracy of Chinese disease-name mapping. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
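The LCS building block that the paper above extends can be computed with the classic dynamic program; the semantic-similarity weighting itself is the paper's contribution and is not reproduced here.

```python
# Standard LCS length via dynamic programming: dp[i][j] is the LCS
# length of the first i characters of a and first j characters of b.

def lcs_length(a: str, b: str) -> int:
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, ca in enumerate(a, 1):
        for j, cb in enumerate(b, 1):
            dp[i][j] = dp[i - 1][j - 1] + 1 if ca == cb \
                else max(dp[i - 1][j], dp[i][j - 1])
    return dp[len(a)][len(b)]
```

For Chinese disease names the same routine applies character-by-character, which is why an exact-match LCS alone misses near-synonymous characters and motivates the semantic extension.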
5. Risk equivalence as an alternative to balancing mean value when trading draft selections and players in major sporting leagues.
- Author
-
Tuck, Geoffrey N. and Richards, Shane A.
- Subjects
ATHLETIC leagues ,MATHEMATICAL equivalence ,MEAN value theorems ,MARKET value ,DISTRIBUTION (Probability theory) ,PROBABILITY theory - Abstract
In sports leagues that use an annual draft to assign eligible players to clubs, having a value associated with each draft selection can allow clubs to anticipate the future growth of players and, if a trading period exists, assist negotiations when exchanging draft selections and players. Mean draft values typically decline in an exponential or geometric manner with increasing draft selection number. Aggregate mean values have been used to compare trade packages. However, clubs may also want to ensure that a trade does not increase the probability of obtaining poor players in the draft. This paper therefore considers equivalence of risk as an alternative trading strategy for club list managers. Here, risk is defined as the probability that the aggregate value of the received draft selections falls below a minimum acceptable level. For risk equivalence, a premium over and above mean market value may need to be paid when trading to secure higher draft selections. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
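The risk definition in the abstract above lends itself to a short Monte Carlo sketch. Everything numeric here is hypothetical: pick values decay exponentially from an arbitrary 100 at pick 0, and player outcomes get lognormal noise; the paper's actual value curves and distributions are not reproduced.

```python
# Monte Carlo estimate of trade risk: the probability that the aggregate
# realized value of the received picks falls below a minimum acceptable
# level. Decay rate, scale and noise are illustrative assumptions.

import random

def trade_risk(picks, threshold, n_sims=20_000, seed=1):
    """picks: list of draft selection numbers received in the trade."""
    rng = random.Random(seed)
    below = 0
    for _ in range(n_sims):
        total = sum(100 * 0.97 ** p * rng.lognormvariate(0, 0.3)
                    for p in picks)
        below += total < threshold
    return below / n_sims
```

Under this sketch, two trade packages with equal aggregate mean value can still carry different `trade_risk`, which is the paper's motivation for risk equivalence.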
6. Hybrid PSO-FLC for dynamic global peak extraction of the partially shaded photovoltaic system.
- Author
-
Farh, Hassan M. H., Eltamaly, Ali M., and Othman, Mohd F.
- Subjects
PARTICLE swarm optimization ,FUZZY logic ,PHOTOVOLTAIC power systems ,DATA modeling ,POLLINATION - Abstract
Particle Swarm Optimization (PSO) is widely used in maximum power point tracking (MPPT) of photovoltaic (PV) energy systems. Nevertheless, this technique suffers from two main problems in the case of partial shading conditions (PSCs). The first problem is that PSO is a time-invariant optimization technique that cannot follow the dynamic global peak (GP) under time-variant shading patterns (SPs) and sticks to the first GP that occurs at the beginning. This problem can be solved by dispersing the PSO particles using two new techniques introduced in this paper: the two proposed PSO re-initialization techniques disperse the particles upon an SP change and upon a predefined time (PDT), respectively. The second problem is the high oscillation around steady state, which can be solved by using a fuzzy logic controller (FLC) to fine-tune the output power and voltage of the PV system. The new contribution of this paper is the hybrid PSO-FLC with the two PSO particle-dispersing techniques, which solves the two previously mentioned problems effectively and improves the performance of the PV system under both normal conditions and PSCs. A detailed comparison between the hybrid PSO-FLC and the original PSO using the two proposed methodologies is presented. The results prove the superior performance of hybrid PSO-FLC compared to PSO in terms of efficiency, accuracy, oscillation reduction around steady state and soft tuning of the tracked GP. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
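The first re-initialization idea in the abstract above can be sketched as follows. This is only a rough illustration: the shading-pattern change is detected here as a sudden relative power drop, and the threshold, voltage range and function names are assumptions, not the paper's implementation.

```python
# Sketch: re-disperse PSO particles over the voltage range when the
# shading pattern appears to have changed (detected as a power drop),
# so the swarm can chase the new global peak instead of the stale one.

import random

def redisperse_if_needed(particles, power_now, power_prev,
                         v_min=0.0, v_max=100.0, drop=0.1, rng=random):
    """particles: list of particle voltages; returns (possibly new) list."""
    if power_prev > 0 and (power_prev - power_now) / power_prev > drop:
        # Shading pattern likely changed: scatter particles uniformly.
        return [rng.uniform(v_min, v_max) for _ in particles]
    return particles  # no change detected: keep converged swarm
```

The second proposed technique simply triggers the same re-dispersal on a timer (the PDT) instead of on a detected power drop.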
7. A novel multi-item joint replenishment problem considering multiple type discounts.
- Author
-
Cui, Ligang, Zhang, Yajun, Deng, Jie, and Xu, Maozeng
- Subjects
PRODUCTION scheduling ,DISCOUNT prices ,ECONOMIC decision making ,HEURISTIC algorithms ,SOCIAL problems - Abstract
In business replenishment, discount offers for multiple items may either provide different discount schedules with a single discount type, or provide schedules with multiple discount types. This paper investigates the joint effects of multiple discount schemes on multi-item joint replenishment decisions. A joint replenishment problem (JRP) model considering three discount types simultaneously (all-unit discount, incremental discount, and total-volume discount) is constructed to determine the basic cycle time and the joint replenishment frequencies of the items. To solve the proposed problem, a heuristic algorithm is developed to find the optimal solutions and the corresponding total cost of the JRP model. Numerical experiments test the algorithm, and the computational results for JRPs under different discount combinations show differing degrees of replenishment cost reduction. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
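Two of the three discount types named in the abstract above differ only in how a price break applies, which a few lines make concrete. The break quantity and prices below are hypothetical, and the total-volume discount (applied across the whole order) is omitted for brevity.

```python
# All-unit vs. incremental discount for a single price break.

def all_unit_cost(q, break_qty=100, p_full=10.0, p_disc=9.0):
    """Discounted price applies to EVERY unit once q reaches the break."""
    return q * (p_disc if q >= break_qty else p_full)

def incremental_cost(q, break_qty=100, p_full=10.0, p_disc=9.0):
    """Discounted price applies only to units BEYOND the break."""
    if q <= break_qty:
        return q * p_full
    return break_qty * p_full + (q - break_qty) * p_disc
```

Note the kink this creates: at the break quantity the all-unit cost drops discontinuously (buying more can cost less in total), while the incremental cost is continuous; this is exactly why combining the schemes complicates the JRP cost function.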
8. Turning conceptual systems maps into dynamic simulation models: An Australian case study for diabetes in pregnancy.
- Author
-
Freebairn, Louise, Atkinson, Jo-An, Osgood, Nathaniel D., Kelly, Paul M., McDonnell, Geoff, and Rychetnik, Lucie
- Subjects
DYNAMIC simulation ,CONCEPT mapping ,DYNAMIC models ,SIMULATION methods & models ,DYNAMICAL systems ,GESTATIONAL diabetes - Abstract
Background: System science approaches are increasingly used to explore complex public health problems. Quantitative methods, such as participatory dynamic simulation modelling, can mobilise knowledge to inform health policy decisions. However, the analytic and practical steps required to turn collaboratively developed, qualitative system maps into rigorous and policy-relevant quantified dynamic simulation models are not well described. This paper reports on the processes, interactions and decisions that occurred at the interface between modellers and end-user participants in an applied health sector case study focusing on diabetes in pregnancy. Methods: An analysis was conducted using qualitative data from a participatory dynamic simulation modelling case study in an Australian health policy setting. Recordings of participatory model development workshops and subsequent meetings were analysed and triangulated with field notes and other written records of discussions and decisions. Case study vignettes were collated to illustrate the deliberations and decisions made throughout the model development process. Results: The key analytic objectives and decision-making processes included: defining the model scope; analysing and refining the model structure to maximise local relevance and utility; reviewing and incorporating evidence to inform model parameters and assumptions; focusing the model on priority policy questions; communicating results and applying the models to policy processes. These stages did not occur sequentially; the model development was cyclical and iterative with decisions being re-visited and refined throughout the process. Storytelling was an effective strategy to both communicate and resolve concerns about the model logic and structure, and to communicate the outputs of the model to a broader audience. 
Conclusion: The in-depth analysis reported here examined the application of participatory modelling methods to move beyond qualitative conceptual mapping to the development of a rigorously quantified and policy relevant, complex dynamic simulation model. The analytic objectives and decision-making themes identified provide guidance for interpreting, understanding and reporting future participatory modelling projects and methods. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
9. Multi-criteria decision support for planning and evaluation of performance of viral marketing campaigns in social networks.
- Author
-
Karczmarczyk, Artur, Jankowski, Jarosław, and Wątróbski, Jarosław
- Subjects
SOCIAL network analysis ,INFORMATION processing ,MARKETING ,MULTIPLE criteria decision making ,SIMULATION methods & models - Abstract
The current marketing landscape, apart from conventional approaches, consists of campaigns designed especially for launching information diffusion processes within online networks. Associated research is focused on information propagation models, campaign initialization strategies and factors affecting campaign dynamics. In terms of algorithms and performance evaluation, the final coverage, represented by the fraction of activated nodes within a target network, is usually used. This is not necessarily consistent with real marketing campaigns, which use various characteristics and parameters related to coverage, costs, behavioral patterns and time factors for overall evaluation. This paper presents assumptions for a decision support system for multi-criteria campaign planning and evaluation with inputs from agent-based simulations. The results, delivered from a simulation model based on synthetic networks in the form of decision scenarios, are verified within a real network. Last, but not least, the study proposes a multi-objective campaign evaluation framework with several campaign evaluation metrics integrated. The results showed that the recommendations generated with the use of synthetic networks, applied to real networks, delivered results in line with the decision makers' expectations in terms of the evaluation criteria used. Apart from practical applications, the proposed multi-objective approach creates new evaluation possibilities for theoretical studies focused on information spreading processes within complex networks. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
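The multi-criteria evaluation idea in the abstract above can be illustrated with the simplest possible aggregation, a weighted sum over normalized criteria. The criteria set, weights and function names below are our illustrative assumptions, not the authors' MCDA method.

```python
# Toy multi-criteria campaign scoring: each scenario is scored on
# normalized coverage (more is better), cost and duration (less is
# better), then the best scenario is the one with the highest score.

def score(campaign, weights=None):
    w = weights or {"coverage": 0.5, "cost": 0.3, "time": 0.2}
    return (w["coverage"] * campaign["coverage"]
            + w["cost"] * (1 - campaign["cost"])     # cheaper is better
            + w["time"] * (1 - campaign["time"]))    # faster is better

def best_campaign(campaigns):
    return max(campaigns, key=score)
```

Real MCDA methods replace this naive weighted sum with preference models that handle incomparable criteria, but the decision-scenario ranking shape is the same.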
11. On designing a new cumulative sum Wilcoxon signed rank chart for monitoring process location.
- Author
-
Abid, Muhammad, Nazir, Hafiz Zafar, Tahir, Muhammad, and Riaz, Muhammad
- Subjects
WILCOXON signed-rank test ,QUALITY control charts ,DATA modeling ,PROBABILITY theory ,INFORMATION design - Abstract
In this paper, ranked set sampling is used to develop a non-parametric location chart based on the Wilcoxon signed rank statistic. The average run length and some other run-length characteristics are used to assess the performance of the proposed scheme. Selected distributions, including the Laplace (double exponential), logistic, normal, contaminated normal and Student's t-distributions, are considered to examine the performance of the proposed Wilcoxon signed rank control chart. The proposed scheme shows superior shift-detection ability compared with some of the competing schemes covered in this study. Moreover, the proposed control chart is implemented and illustrated with a real data set. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
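The chart's plotting statistic from the abstract above, the Wilcoxon signed rank of a sample around a target location, can be computed in a few lines. This simplified version ignores tied absolute deviations (which should receive average ranks) and leaves out ranked set sampling and the CUSUM accumulation.

```python
# Signed rank statistic: rank the absolute deviations from the target,
# attach the sign of each deviation, and sum the signed ranks. A value
# far from zero signals a location shift.

def signed_rank(sample, target=0.0):
    devs = [(abs(x - target), 1 if x > target else -1)
            for x in sample if x != target]   # zeros are dropped
    devs.sort()                               # rank by absolute deviation
    return sum(sign * rank for rank, (_, sign) in enumerate(devs, 1))
```

For an in-control symmetric process the statistic is centered at zero, which is what makes it a natural non-parametric charting quantity.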
12. Significance of likes: Analysing passive interactions on Facebook during campaigning.
- Author
-
Rao, Asha and Khairuddin, Mohammad Adib
- Subjects
- *
SOCIAL interaction , *PASSIVITY (Psychology) , *SOCIAL media , *POLITICAL campaigns , *ELECTIONS , *PSYCHOLOGY ,MALAYSIAN elections - Abstract
With more and more political candidates using social media for campaigning, researchers are looking at measuring the effectiveness of this medium. Most research, however, concentrates on the bare count of likes (or twitter mentions) in an attempt to correlate social media presence and winning. In this paper, we propose a novel method, Interaction Strength Plot (IntS) to measure the passive interactions between a candidate’s posts on Facebook and the users (liking the posts). Using this method on original Malaysian General Election (MGE13) and Australian Federal Elections (AFE13) Facebook Pages (FP) campaign data, we label an FP as performing well if both the posting frequency and the likes gathered are above average. Our method shows that over 60% of the MGE13 candidates and 85% of the AFE13 candidates studied in this paper had under-performing FP. Some of these FP owners would have been identified as popular based on bare count. Thus our performance chart is a vital step forward in measuring the effectiveness of online campaigning. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
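The "performing well" rule described in the abstract above reduces to a simple double-threshold test: a page does well only if both its posting frequency and its likes are above the averages across the pages studied. The function below is our paraphrase of that rule, not the authors' IntS code.

```python
# Classify Facebook pages: a page performs well only if BOTH its post
# count and its likes exceed the respective means over all pages.

def well_performing(pages):
    """pages: list of (posts, likes) tuples; returns indices doing well."""
    mean_posts = sum(p for p, _ in pages) / len(pages)
    mean_likes = sum(l for _, l in pages) / len(pages)
    return [i for i, (p, l) in enumerate(pages)
            if p > mean_posts and l > mean_likes]
```

A page with many likes but below-average posting frequency fails this test, which is how the method demotes pages that a bare like count would call popular.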
13. Model-based classification of CPT data and automated lithostratigraphic mapping for high-resolution characterization of a heterogeneous sedimentary aquifer.
- Author
-
Rogiers, Bart, Mallants, Dirk, Batelaan, Okke, Gedeon, Matej, Huysmans, Marijke, and Dassargues, Alain
- Subjects
CONE penetration tests ,GROUNDWATER flow ,ALGORITHMS ,BAYESIAN analysis - Abstract
Cone penetration testing (CPT) is one of the most efficient and versatile methods currently available for geotechnical, lithostratigraphic and hydrogeological site characterization. Currently available methods for soil behaviour type (SBT) classification of CPT data, however, have severe limitations, often restricting their application to a local scale. For parameterization of regional groundwater flow or geotechnical models, and delineation of regional hydro- or lithostratigraphy, regional SBT classification would be very useful. This paper investigates the use of model-based clustering for SBT classification, and the influence of different clustering approaches on the properties and spatial distribution of the obtained soil classes. We additionally propose a methodology for automated lithostratigraphic mapping of regionally occurring sedimentary units using SBT classification. The methodology is applied to a large CPT dataset, covering a groundwater basin of ~60 km² with predominantly unconsolidated sandy sediments in northern Belgium. Results show that the model-based approach is superior in detecting the true lithological classes when compared to more frequently applied unsupervised classification approaches or literature classification diagrams. We demonstrate that automated mapping of lithostratigraphic units using advanced SBT classification techniques can provide a large gain in efficiency, compared to more time-consuming manual approaches, and yields at least equally accurate results. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
14. Multiswarm comprehensive learning particle swarm optimization for solving multiobjective optimization problems.
- Author
-
Yu, Xiang and Zhang, Xueqing
- Subjects
PARTICLE swarm optimization ,DIFFERENTIAL evolution ,PARETO analysis ,METAHEURISTIC algorithms ,APPLIED mathematics ,EVOLUTIONARY computation - Abstract
Comprehensive learning particle swarm optimization (CLPSO) is a powerful state-of-the-art single-objective metaheuristic. Extending from CLPSO, this paper proposes multiswarm CLPSO (MSCLPSO) for multiobjective optimization. MSCLPSO involves multiple swarms, with each swarm associated with a separate original objective. Each particle’s personal best position is determined just according to the corresponding single objective. Elitists are stored externally. MSCLPSO differs from existing multiobjective particle swarm optimizers in three aspects. First, each swarm focuses on optimizing the associated objective using CLPSO, without learning from the elitists or any other swarm. Second, mutation is applied to the elitists and the mutation strategy appropriately exploits the personal best positions and elitists. Third, a modified differential evolution (DE) strategy is applied to some extreme and least crowded elitists. The DE strategy updates an elitist based on the differences of the elitists. The personal best positions carry useful information about the Pareto set, and the mutation and DE strategies help MSCLPSO discover the true Pareto front. Experiments conducted on various benchmark problems demonstrate that MSCLPSO can find nondominated solutions distributed reasonably over the true Pareto front in a single run. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
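The differential evolution step described in the MSCLPSO abstract above, updating an elitist from the differences of other elitists, follows the classic DE/rand/1 form. The scale factor and the choice of which elitists to combine are illustrative here; the paper selects extreme and least-crowded elitists.

```python
# DE/rand/1 trial vector: perturb a base position by the scaled
# difference of two other positions. In MSCLPSO the three vectors
# would all be externally stored elitists.

def de_trial(base, e1, e2, f=0.5):
    """base, e1, e2: equal-length position vectors (lists of floats)."""
    return [b + f * (x1 - x2) for b, x1, x2 in zip(base, e1, e2)]
```

Because the difference vectors are built from elitists already near the Pareto front, the trial points tend to probe along the front rather than away from it.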
15. The Assessment of Landscape Expressivity: A Free Choice Profiling Approach.
- Author
-
Harding, Stephan P., Burch, Sebastian E., and Wemelsfelder, Françoise
- Subjects
LANDSCAPES ,DIGITAL images ,VISUAL analog scale ,DATA analysis ,MULTIVARIATE analysis - Abstract
In this paper we explore a relational understanding of landscape qualities. We asked three independent groups of human observers to assess the expressive qualities of a range of landscapes in the UK and in Spain, either by means of personal visits or from a projected digital image. We employed a Free Choice Profiling (FCP) methodology, in which observers generated their own descriptive terminologies and then used these to quantify perceived landscape qualities on visual analogue scales. Data were analysed using Generalised Procrustes Analysis, a multivariate statistical technique that does not rely on fixed variables to identify underlying dimensions of assessment. The three observer groups each showed significant agreement, and generated two main consensus dimensions that suggested landscape ‘health’ and ‘development in time’ as common perceived themes of landscape expressivity. We critically discuss these outcomes in context of the landscape assessment literature, and suggest ways forward for further development and research. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
16. From Regional to National Clouds: TV Coverage in the Czech Republic.
- Author
-
Sucháček, Jan, Sed’a, Petr, Friedrich, Václav, Wachowiak-Smolíková, Renata, and Wachowiak, Mark P.
- Subjects
GATEKEEPING ,MASS media ,TELEVISION ,CORRESPONDENCE analysis (Statistics) ,QUANTITATIVE research - Abstract
Media, and particularly TV media, have a great impact on the general public. In recent years, spatial patterns of information and the relevance of intangible geographies have become increasingly important. Gatekeeping plays a critical role in the selection of information that is transformed into media. Therefore, gatekeeping, through national media, also co-forms the generation of mental maps. In this paper, correspondence analysis (a statistical method) combined with cloud lines (a new visual analytics technique) is used to analyze how individual major regional events in one of the post-communist countries, the Czech Republic, penetrate into the media on a national scale. Although national news should minimize distortions about regions, this assumption has not been verified by our research. Impressions presented by the media of selected regions that were markedly influenced by one or several events in those regions demonstrate that gatekeepers, especially news reporters, functioned as a filter by selecting only a few specific, and in many cases, unusual events for dissemination. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
17. Retinal Image Simulation of Subjective Refraction Techniques.
- Author
-
Perches, Sara, Collados, M. Victoria, and Ares, Jorge
- Subjects
RETINAL imaging ,REFRACTION (Optics) ,IMAGE quality analysis ,CONTACT lenses ,SIMULATION methods & models - Abstract
Refraction techniques make it possible to determine the most appropriate sphero-cylindrical lens prescription to achieve the best possible visual quality. Among these techniques, subjective refraction (i.e., patient’s response-guided refraction) is the most commonly used approach. In this context, this paper’s main goal is to present a simulation software that implements in a virtual manner various subjective-refraction techniques—including Jackson’s Cross-Cylinder test (JCC)—relying all on the observation of computer-generated retinal images. This software has also been used to evaluate visual quality when the JCC test is performed in multifocal-contact-lens wearers. The results reveal this software’s usefulness to simulate the retinal image quality that a particular visual compensation provides. Moreover, it can help to gain a deeper insight and to improve existing refraction techniques and it can be used for simulated training. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
18. Optimal designs of the side sensitive synthetic chart for the coefficient of variation based on the median run length and expected median run length
- Author
-
Khai Wah Khaw, Waie Chung Yeong, Peh Sang Ng, Sok Li Lim, and Ping Yin Lee
- Subjects
Optimal design ,Computer and Information Sciences ,Markov Models ,Percentile ,Probability distribution ,Control limits ,Charts ,Statistics ,Research Article - Abstract
The side sensitive synthetic chart was proposed to improve the performance of the synthetic chart for monitoring shifts in the coefficient of variation (γ) by incorporating the side sensitivity feature, where successive non-conforming samples must fall on the same side of the control limits. The existing side sensitive synthetic-γ chart has only been evaluated in terms of the average run length (ARL) and expected average run length (EARL). However, the run length distribution is skewed to the right, so the actual performance of the chart may frequently differ from what the ARL and EARL indicate. This paper evaluates the entire run length distribution by studying its percentiles. It is shown that false alarms frequently happen much earlier than the in-control ARL (ARL0) suggests, and small shifts are often detected earlier than the out-of-control ARL (ARL1) suggests. Subsequently, this paper proposes an alternative design based on the median run length (MRL) and expected median run length (EMRL). The optimal design based on the MRL shows a smaller out-of-control MRL (MRL1), indicating quicker detection of the out-of-control condition compared to the existing design, while the results from the optimal design based on the EMRL are similar to those of the existing designs. Comparisons with the synthetic-γ chart without side sensitivity show that side sensitivity reduces the median number of samples required to detect a shift and reduces the variability in the run length. Finally, the proposed designs are implemented on an actual industrial example.
- Published
- 2021
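The skewness argument in the abstract above is easy to see in the simplest case: for a chart whose run length is geometric with per-sample signal probability p, every percentile has a closed form, and the median sits well below the ARL (= 1/p). This geometric model is a standard simplification, not the synthetic chart's exact run-length law.

```python
# Percentiles of a geometric run length: the smallest n with
# P(RL <= n) >= q. The median (q = 0.5) is the MRL; ARL = 1/p.

import math

def run_length_percentile(p, q=0.5):
    return math.ceil(math.log(1 - q) / math.log(1 - p))

def arl(p):
    return 1 / p
```

With p = 0.005 (a nominal in-control ARL0 of 200), the median run length is only 139: half of all false alarms occur before sample 139, which is the "false alarms happen much earlier than ARL0" effect the paper exploits.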
19. Process monitoring using inflated beta regression control chart
- Author
-
Fábio M. Bayer, Luiz M. A. Lima-Filho, Tarciana Liberal Pereira, and Tatiene Correia de Souza
- Subjects
Computer and Information Sciences ,Statistical methods ,Control chart ,Regression analysis ,Beta distribution ,Estimator ,Monte Carlo method ,Control limits ,Charts ,Research Article - Abstract
This paper provides a general framework for controlling quality characteristics related to control variables and limited to the intervals (0, 1], [0, 1), or [0, 1]. The proposed control chart is based on the inflated beta regression model considering a reparametrization of the inflated beta distribution indexed by the response mean, which is useful for modeling fractions and proportions. The contribution of the paper is twofold. First, we extend the inflated beta regression model by allowing a regression structure for the precision parameter. We also present closed-form expressions for the score vector and Fisher's information matrix. Second, based on the proposed regression model, we introduce a new model-based control chart. The control limits are obtained considering the estimates of the inflated beta regression model parameters. We conduct a Monte Carlo simulation study to evaluate the performance of the proposed regression model estimators, and the performance of the proposed control chart is evaluated in terms of run length distribution. Finally, we present and discuss an empirical application to show the applicability of the proposed regression control chart.
- Published
- 2020
20. Hybrid PSO-FLC for dynamic global peak extraction of the partially shaded photovoltaic system
- Author
-
Mohd Fauzi Othman, Hassan M. H. Farh, and Ali M. Eltamaly
- Subjects
Optimization ,Computer and Information Sciences ,Time Factors ,Computer science ,020209 energy ,lcsh:Medicine ,02 engineering and technology ,Plant Science ,Flowers ,Research and Analysis Methods ,Infographics ,Maximum power point tracking ,Reduction (complexity) ,Fuzzy Logic ,Control theory ,0202 electrical engineering, electronic engineering, information engineering ,Solar Energy ,Computer Simulation ,Pollination ,lcsh:Science ,Extraction Techniques ,Multidisciplinary ,Steady state ,Data Visualization ,Plant Anatomy ,Photovoltaic system ,lcsh:R ,Particle swarm optimization ,Biology and Life Sciences ,Models, Theoretical ,Computing Methods ,Charts ,Power (physics) ,Energy and Power ,Plant Physiology ,Photovoltaic Power ,Physical Sciences ,Engineering and Technology ,lcsh:Q ,Alternative Energy ,Mathematics ,Algorithms ,Voltage ,Research Article - Abstract
Particle Swarm Optimization (PSO) is widely used for maximum power point tracking (MPPT) in photovoltaic (PV) energy systems. Nevertheless, the technique suffers from two main problems under partial shading conditions (PSCs). First, PSO is a time-invariant optimization technique that cannot follow the dynamic global peak (GP) under time-variant shading patterns (SPs); it sticks to the first GP found at the beginning. This problem can be solved by dispersing the PSO particles using two new techniques introduced in this paper: re-initializing the particles whenever the SP changes, or after a predefined time (PDT). The second problem is high oscillation around steady state, which can be solved by using a fuzzy logic controller (FLC) to fine-tune the output power and voltage of the PV system. The new contribution of this paper is a hybrid PSO-FLC with the two particle-dispersal techniques, which solves both problems effectively and improves the performance of the PV system under both normal conditions and PSCs. A detailed comparison between the hybrid PSO-FLC and the original PSO using the two proposed methodologies is presented. The results prove the superior performance of the hybrid PSO-FLC over PSO in terms of efficiency, accuracy, reduced oscillation around steady state, and smooth tuning of the tracked GP.
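The SP-change re-initialization idea can be sketched in a few lines: run a standard PSO on the current power curve and, when the shading pattern changes, disperse a fresh swarm rather than keeping the stale global best. This is an illustrative toy (simple quadratic "power curves" and made-up PSO constants), not the authors' controller:

```python
import random

def pso_track(power, n_particles=8, iters=60, seed=1):
    """Minimal PSO maximising a 1-D power curve power(v) over [0, 1].

    Dispersal on a shading-pattern change is modelled by simply
    calling this again with the new curve, i.e. a fresh swarm.
    """
    rng = random.Random(seed)
    pos = [rng.random() for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]                       # personal bests
    gbest = max(pos, key=power)          # global best
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vel[i] = (0.6 * vel[i] + 1.5 * r1 * (pbest[i] - pos[i])
                      + 1.5 * r2 * (gbest - pos[i]))
            pos[i] = min(1.0, max(0.0, pos[i] + vel[i]))
            if power(pos[i]) > power(pbest[i]):
                pbest[i] = pos[i]
        gbest = max(pbest, key=power)
    return gbest

# Two shading patterns with different global peaks (toy curves).
sp1 = lambda v: -(v - 0.3) ** 2          # GP near v = 0.3
sp2 = lambda v: -(v - 0.8) ** 2          # GP near v = 0.8 after SP change

v1 = pso_track(sp1)
v2 = pso_track(sp2)                      # re-initialized swarm tracks the new GP
```

Without the re-initialization step, a swarm converged near v = 0.3 would keep reporting that stale peak after the pattern changes, which is exactly the first problem the abstract describes.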
- Published
- 2018
21. Significance of likes: Analysing passive interactions on Facebook during campaigning
- Author
-
Asha Rao and Mohammad Adib Khairuddin
- Subjects
Male ,Computer and Information Sciences ,Facebook ,Political Science ,Twitter ,Social Sciences ,Interaction strength ,lcsh:Medicine ,030204 cardiovascular system & hematology ,Elections ,Infographics ,Plot (graphics) ,03 medical and health sciences ,0302 clinical medicine ,Sociology ,Chart ,General election ,Humans ,Social media ,030212 general & internal medicine ,lcsh:Science ,Statistical Data ,Multidisciplinary ,Data Visualization ,Politics ,lcsh:R ,Malaysia ,Social Communication ,Advertising ,Charts ,Democracy ,Communications ,Data Acquisition ,Social Networks ,Political Candidates ,Physical Sciences ,Female ,lcsh:Q ,Psychology ,Social Media ,Network Analysis ,Mathematics ,Statistics (Mathematics) ,Research Article - Abstract
With more and more political candidates using social media for campaigning, researchers are looking to measure the effectiveness of this medium. Most research, however, concentrates on the bare count of likes (or Twitter mentions) in an attempt to correlate social media presence with winning. In this paper, we propose a novel method, the Interaction Strength Plot (IntS), to measure the passive interactions between a candidate's posts on Facebook and the users liking those posts. Applying this method to Facebook Page (FP) campaign data from the Malaysian General Election (MGE13) and the Australian Federal Election (AFE13), we label an FP as performing well if both its posting frequency and the likes it gathered are above average. Our method shows that over 60% of the MGE13 candidates and 85% of the AFE13 candidates studied had under-performing FPs. Some of these FP owners would have been identified as popular based on bare counts. Our performance chart is thus a vital step forward in measuring the effectiveness of online campaigning.
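The above-average criterion described in the abstract can be sketched as follows; the field names and sample data are invented for illustration, not the authors' data format:

```python
def flag_pages(pages):
    """Label a Facebook Page (FP) as performing well only when BOTH its
    posting frequency and its likes exceed the sample mean, a simplified
    reading of the paper's IntS well-performing criterion."""
    n = len(pages)
    mean_posts = sum(p["posts"] for p in pages) / n
    mean_likes = sum(p["likes"] for p in pages) / n
    return {p["name"]: (p["posts"] > mean_posts and p["likes"] > mean_likes)
            for p in pages}

pages = [
    {"name": "A", "posts": 50, "likes": 900},   # active and liked
    {"name": "B", "posts": 5,  "likes": 1200},  # liked, but rarely posts
    {"name": "C", "posts": 40, "likes": 100},   # posts often, few likes
]
flags = flag_pages(pages)  # only "A" satisfies both conditions
```

Note that page "B" would look popular on a bare like count alone, which is the kind of misclassification the paper argues against.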
- Published
- 2017
22. Analysis and prediction of unplanned intensive care unit readmission using recurrent neural networks with long short-term memory.
- Author
-
Lin, Yu-Wei, Zhou, Yuqian, Faghri, Faraz, Shaw, Michael J., and Campbell, Roy H.
- Subjects
RECURRENT neural networks ,INTENSIVE care units ,SHORT-term memory ,HOSPITAL administration ,MEDICAL wastes ,SUPERVISED learning - Abstract
Background: Unplanned readmission of a hospitalized patient is an indicator of patients' exposure to risk and an avoidable waste of medical resources. Beyond hospital readmission, intensive care unit (ICU) readmission brings further financial risk, along with morbidity and mortality risks. Identifying high-risk patients who are likely to be readmitted can provide significant benefits for both patients and medical providers. The emergence of machine learning solutions that detect hidden patterns in complex, multi-dimensional datasets provides unparalleled opportunities for developing an efficient discharge decision-making support system for physicians and ICU specialists. Methods and findings: We used supervised machine learning approaches on comprehensive, longitudinal clinical data from the MIMIC-III database to predict ICU readmission of patients within 30 days of their discharge. We incorporated multiple types of features, including chart events, demographics, and ICD-9 embeddings. By using recurrent neural networks (RNN) with long short-term memory (LSTM), we were able to incorporate the multivariate features of EHRs and capture sudden fluctuations in chart-event features (e.g., glucose and heart rate). We show that our LSTM-based solution can better capture high volatility and unstable status in ICU patients, an important factor in ICU readmission. Our machine learning models identify ICU readmissions at a higher sensitivity of 0.742 (95% CI, 0.718–0.766) and an improved area under the curve of 0.791 (95% CI, 0.782–0.800) compared with traditional methods. We perform an in-depth analysis of deep learning performance, as well as of each feature's contribution to the predictive model.
Conclusion: Our manuscript highlights the ability of machine learning models to improve our ICU decision-making accuracy and is a real-world example of precision medicine in hospitals. These data-driven solutions hold the potential for substantial clinical impact by augmenting clinical decision-making for physicians and ICU specialists. We anticipate that machine learning models will improve patient counseling, hospital administration, allocation of healthcare resources and ultimately individualized clinical care. [ABSTRACT FROM AUTHOR]
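The area-under-the-curve figure reported above can be computed from predicted risk scores with the rank-based (Mann-Whitney) estimator; a minimal sketch with invented labels and scores, unrelated to the authors' MIMIC-III pipeline:

```python
def auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen positive case outscores a
    randomly chosen negative one (ties counted as 0.5)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

y = [1, 1, 0, 0, 0]                 # 1 = readmitted within 30 days
s = [0.9, 0.4, 0.5, 0.3, 0.1]       # model risk scores
score = auc(y, s)                   # 5 of 6 positive/negative pairs ranked correctly
```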
- Published
- 2019
- Full Text
- View/download PDF
23. Scoring reading parameters: An inter-rater reliability study using the MNREAD chart.
- Author
-
Baskaran, Karthikeyan, Macedo, Antonio Filipe, He, Yingchen, Hernandez-Moreno, Laura, Queirós, Tatiana, Mansfield, J. Stephen, and Calabrèse, Aurélie
- Abstract
Purpose: First, to evaluate inter-rater reliability when human raters estimate the reading performance of visually impaired individuals using the MNREAD acuity chart. Second, to evaluate the agreement between computer-based scoring algorithms and compare them with human rating. Methods: Reading performance was measured for 101 individuals with low vision, using the Portuguese version of the MNREAD test. Seven raters estimated the maximum reading speed (MRS) and critical print size (CPS) of each individual MNREAD curve. MRS and CPS were also calculated automatically for each curve using two different algorithms: the original standard deviation method (SDev) and non-linear mixed-effects (NLME) modeling. Intra-class correlation coefficients (ICC) were used to estimate absolute agreement between raters and/or algorithms. Results: Absolute agreement between raters was ‘excellent’ for MRS (ICC = 0.97; 95%CI [0.96, 0.98]) and ‘moderate’ to ‘good’ for CPS (ICC = 0.77; 95%CI [0.69, 0.83]). For CPS, inter-rater reliability was poorer among less experienced raters (ICC = 0.70; 95%CI [0.57, 0.80]) when compared to experienced ones (ICC = 0.82; 95%CI [0.76, 0.88]). Absolute agreement between the two algorithms was ‘excellent’ for MRS (ICC = 0.96; 95%CI [0.91, 0.98]). For CPS, the best possible agreement was found for CPS defined as the print size sustaining 80% of MRS (ICC = 0.77; 95%CI [0.68, 0.84]). Absolute agreement between raters and automated methods was ‘excellent’ for MRS (ICC = 0.96; 95% CI [0.88, 0.98] for SDev; ICC = 0.97; 95% CI [0.95, 0.98] for NLME). For CPS, absolute agreement between raters and SDev ranged from ‘poor’ to ‘good’ (ICC = 0.66; 95% CI [0.3, 0.80]), while agreement between raters and NLME was ‘good’ (ICC = 0.83; 95% CI [0.76, 0.88]). Conclusion: For MRS, inter-rater reliability is excellent, even considering the possibility of noisy and/or incomplete data collected in low-vision individuals. For CPS, inter-rater reliability is lower.
This may be problematic, for instance in the context of multisite investigations or follow-up examinations. The NLME method showed better agreement with the raters than the SDev method for both reading parameters. Setting up consensual guidelines to deal with ambiguous curves may help improve reliability. While the exact definition of CPS should be chosen on a case-by-case basis depending on the clinician or researcher’s motivations, evidence suggests that estimating CPS as the smallest print size sustaining about 80% of MRS would increase inter-rater reliability. [ABSTRACT FROM AUTHOR]
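The 80%-of-MRS definition of CPS recommended in the conclusion can be applied directly to an MNREAD-style curve; a minimal sketch with an invented sample curve (print sizes in logMAR, speeds in words per minute):

```python
def mrs_and_cps(curve, frac=0.8):
    """Given {print_size_logMAR: reading_speed_wpm}, return the maximum
    reading speed (MRS) and a critical print size (CPS), here defined as
    the smallest print size whose speed still reaches frac * MRS,
    i.e. the 80% criterion the study found most reliable."""
    mrs = max(curve.values())
    sustained = [size for size, speed in curve.items() if speed >= frac * mrs]
    return mrs, min(sustained)

# Toy curve: speed plateaus at large print and collapses at small print.
curve = {1.0: 150, 0.8: 148, 0.6: 145, 0.4: 120, 0.2: 60}
mrs, cps = mrs_and_cps(curve)
```

Reading speed at 0.4 logMAR (120 wpm) still reaches 80% of the 150 wpm maximum, while 0.2 logMAR (60 wpm) does not, so 0.4 is returned as the CPS.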
- Published
- 2019
- Full Text
- View/download PDF
24. Forecasting stock prices with a feature fusion LSTM-CNN model using different representations of the same data.
- Author
-
Kim, Taewook and Kim, Ha Young
- Subjects
STOCK prices ,NEURAL circuitry ,TIME series analysis ,DATA analysis ,SHORT-term memory - Abstract
Forecasting stock prices plays an important role in setting a trading strategy or determining the appropriate timing for buying or selling a stock. We propose a model, called the feature fusion long short-term memory-convolutional neural network (LSTM-CNN) model, that combines features learned from different representations of the same data, namely, stock time series and stock chart images, to predict stock prices. The proposed model is composed of LSTM and a CNN, which are utilized for extracting temporal features and image features. We measure the performance of the proposed model relative to those of single models (CNN and LSTM) using SPDR S&P 500 ETF data. Our feature fusion LSTM-CNN model outperforms the single models in predicting stock prices. In addition, we discover that a candlestick chart is the most appropriate stock chart image to use to forecast stock prices. Thus, this study shows that prediction error can be efficiently reduced by using a combination of temporal and image features from the same data rather than using these features separately. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
25. The Light Sword Lens - A novel method of presbyopia compensation: Pilot clinical study.
- Author
-
Petelczyc, Krzysztof, Byszewska, Anna, Chojnacka, Ewelina, Jaroszewicz, Zbigniew, Kakarenko, Karol, Mira-Agudelo, Alejandro, Ostrowska-Spaleniak, Aleksandra, Składowska, Aleksandra, Kołodziejczyk, Andrzej, and Rękas, Marek
- Subjects
PRESBYOPIA ,VISUAL acuity ,DIABETIC retinopathy ,INTRAOCULAR lenses ,TASK performance ,CLINICAL trials - Abstract
Purpose: Clinical assessment of a new optical element for presbyopia correction, the Light Sword Lens. Methods: The healthy dominant eyes of 34 presbyopes were examined for visual performance in three trials: reference (with a lens for distance correction), stenopeic (distance correction with a pinhole, ϕ = 1.25 mm) and Light Sword Lens (distance correction with a Light Sword Lens). In each trial, visual acuity was assessed in seven tasks for defocus from 0.2 D to 3.0 D, and contrast sensitivity in two tasks for defocus of 0.3 D and 2.5 D. The Early Treatment Diabetic Retinopathy Study protocol and the Pelli-Robson method were applied. The degree of homogeneity of the visual acuity and contrast sensitivity results through defocus was determined, and the reference and stenopeic trials were compared to the Light Sword Lens results. Friedman analysis of variance, Nemenyi post-hoc, and Wilcoxon tests were used; a p-value < 0.05 was considered significant. Results: In the Light Sword Lens trial, visual acuity was stable over the tested defocus range [20/25–20/32], while the stenopeic trial exhibited a limited range of degradation [20/25–20/40]. Contrast sensitivity in the Light Sword Lens and reference trials was high [1.9–2.0 logCS] for both defocus cases, but low in the stenopeic condition [1.5–1.7 logCS]. Between-trial comparisons showed significant differences in visual acuity only for Light Sword Lens versus reference trials, and in contrast sensitivity only for Light Sword Lens versus stenopeic trials. Conclusions: Visual acuity achieved with Light Sword Lens correction in the presbyopic eye is comparable to stenopeic correction but exhibits no significant loss in contrast sensitivity. Such a correction method seems very promising for the design of novel contact and intraocular lenses. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
26. Optimisation of children z-score calculation based on new statistical techniques.
- Author
-
Martinez-Millana, Antonio, Hulst, Jessie M., Boon, Mieke, Witters, Peter, Fernandez-Llatas, Carlos, Asseiceira, Ines, Calvo-Lerma, Joaquin, Basagoiti, Ignacio, Traver, Vicente, De Boeck, Kris, and Ribes-Koninckx, Carmen
- Subjects
ANTHROPOMETRY ,MATHEMATICAL models of population ,GAUSSIAN processes ,MATHEMATICAL analysis ,NUMERICAL analysis - Abstract
Background: Expressing anthropometric parameters (height, weight, BMI) as z-scores is a key principle in the clinical assessment of children and adolescents. The Centers for Disease Control and Prevention (CDC) growth charts and the CDC-LMS method for z-score calculation are widely used to assess growth and nutritional status, though they can be imprecise in some percentiles. Objective: To improve the accuracy of z-score calculation by revising the statistical method using the original data used to develop current z-score calculators. Design: A Gaussian process regression (GPR) model was designed and internally validated. Z-scores for weight-for-age (WFA), height-for-age (HFA) and BMI-for-age (BMIFA) were compared with the WHO and CDC-LMS methods using 1) standard z-score cut-off points, 2) a simulated population of 3000 children and 3) real observations from 212 children aged 2 to 18 years. Results: GPR yielded more accurate calculation of z-scores for standard cut-off points (p << 0.001) with respect to the CDC-LMS and WHO approaches. WFA, HFA and BMIFA z-scores calculated with the three different methods on simulated and real patients showed large variation irrespective of gender and age; z-scores around 0 ± 1 showed larger variation than values above and below ±2. Conclusion: The revised z-score calculation method was more accurate than the CDC-LMS and WHO methods for standard cut-off points. On simulated and real data, GPR-based calculation provides more accurate z-score determinations and thus a better classification of patients below and above cut-off points. Statisticians and clinicians should consider the potential benefits of updating their calculation method for accurate z-score determination. [ABSTRACT FROM AUTHOR]
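For reference, the CDC-LMS z-score that the paper benchmarks against is a closed-form transform of the measurement x and the age- and sex-specific L, M, S reference values; the L/M/S numbers below are illustrative only, not taken from the published tables:

```python
import math

def lms_z(x, L, M, S):
    """CDC/WHO LMS z-score: z = ((x/M)**L - 1) / (L*S) for L != 0,
    and ln(x/M) / S in the limit L -> 0. The L (Box-Cox power),
    M (median) and S (coefficient of variation) come from the
    growth-reference tables for the child's age and sex."""
    if L == 0:
        return math.log(x / M) / S
    return ((x / M) ** L - 1) / (L * S)

# A measurement equal to the median M always gives z = 0.
z_median = lms_z(16.0, L=-1.6, M=16.0, S=0.08)
z_above = lms_z(18.0, L=-1.6, M=16.0, S=0.08)  # above the median -> positive z
```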
- Published
- 2018
- Full Text
- View/download PDF
27. Eustachian tube dysfunction: A diagnostic accuracy study and proposed diagnostic pathway.
- Author
-
Smith, Matthew E., Takwoingi, Yemisi, Deeks, Jon, Alper, Cuneyt, Bance, Manohar L., Bhutta, Mahmood F., Donnelly, Neil, Poe, Dennis, and Tysome, James R.
- Subjects
EUSTACHIAN tube ,TREATMENT effectiveness ,REGRESSION analysis ,IMPEDANCE audiometry ,CLINICAL trials - Abstract
Background and aims: Eustachian tube dysfunction (ETD) is a commonly diagnosed disorder of Eustachian tube opening and closure, which may be associated with severe symptoms and middle ear disease. Currently the diagnosis of obstructive and patulous forms of ETD is primarily based on non-specific symptoms or examination findings, rather than measurement of the underlying function of the Eustachian tube. This has proved problematic when selecting patients for treatment, and when designing trial inclusion criteria and outcomes. This study aims to determine the correlation and diagnostic value of various tests of ET opening and patient-reported outcome measures (PROMs), in order to generate a recommended diagnostic pathway for ETD. Methods: Index tests included two PROMs and 14 tests of ET opening (nine for obstructive, five for patulous ETD). In the absence of an accepted reference standard, two methods were adopted to establish index test accuracy: expert panel diagnosis and latent class analysis. Index test results were assessed with Pearson correlation and principal component analysis, and test accuracy was determined. Logistic regression models assessed the predictive value of grouped test results. Results: The expert panel diagnosis and PROMs results correlated with each other, but not with ET function measured by tests of ET opening. All index tests were found to be feasible in clinic, and acceptable to patients. PROMs had very poor specificity, and no diagnostic value. Combining the results of tests of ET function appeared beneficial. The latent class model suggested tympanometry, sonotubometry and tubomanometry have the best diagnostic performance for obstructive ETD, and these are included in a proposed diagnostic pathway. Conclusions: ETD should be diagnosed on the basis of clinical assessment and tests of ET opening, as PROMs have no diagnostic value.
Currently diagnostic uncertainty exists for some patients who appear to have intermittent ETD clinically, but have negative index test results. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
28. Incidence, causes, and consequences of preventable adverse drug reactions occurring in inpatients: A systematic review of systematic reviews.
- Author
-
Wolfe, Dianna, Yazdi, Fatemeh, Kanji, Salmaan, Burry, Lisa, Beck, Andrew, Butler, Claire, Esmaeilisaraji, Leila, Hamel, Candyce, Hersi, Mona, Skidmore, Becky, Moher, David, and Hutton, Brian
- Subjects
DRUG side effects ,MEDICAL care costs ,CLINICAL trials ,META-analysis ,PARAMETER estimation - Abstract
Background: Preventable adverse drug reactions (PADRs) in inpatients are associated with harm, including increased length of stay and potential loss of life, and result in elevated costs of care. We conducted an overview of reviews (i.e., a systematic review of systematic reviews) to determine the incidence of PADRs experienced by inpatients. Secondary review objectives were related to assessment of the effects of patient age, setting, and clinical specialty on PADR incidence. Methods: The protocol was registered in PROSPERO (CRD42016043220). We performed a search of Medline, Embase, and the Cochrane Library, limiting languages of publication to English and French. We included published systematic reviews that reported quantitative data on the incidence of PADRs in patients receiving acute or ambulatory care in a hospital setting. The full texts of all primary studies for which PADR data were reported in the included reviews were obtained and data relevant to review objectives were extracted. Quality of the included reviews was assessed using the AMSTAR-2 tool. Both narrative summaries of findings and meta-analyses of primary study data were undertaken. Results: Thirteen systematic reviews encompassing 37 unique primary studies were included. Across primary studies, the PADR incidence was highly varied, ranging from 0.006 to 13.3 PADRs per 100 patients, with a pooled incidence estimate of 0.59 PADRs per 100 patients. Substantial heterogeneity was present across both reviews and primary studies with respect to review/study objectives, patient age, hospital setting, medical discipline, definitions and assessment tools used, event detection methods, endpoints of interest, and units of measure. Thirteen primary studies used prospective event detection methods and had a pooled PADR incidence of 3.13 (2.87–3.38) PADRs per 100 patients; however, extreme statistical heterogeneity (I² = 97%) indicated this finding should be considered with caution. Subgroup meta-analyses demonstrated that PADR incidence varied significantly with event detection method (prospective > retrospective > voluntary reporting methods), hospital setting (ICU > wards), and medical discipline (medical > surgical). High statistical heterogeneity (I² > 80%) was present across all analyses, indicating results should be interpreted with caution. Effects of patient age could not be assessed due to poor reporting of age groups used in primary studies. Discussion: The method of event detection appeared to significantly influence PADR incidence, with prospective methods having the highest reported PADR rate. This finding is in agreement with the background literature. High methodological and statistical heterogeneity across primary studies evaluating adverse drug events reduces the validity of the overall PADR incidence derived from the meta-analyses of the pooled data. Data pooled from studies using only prospective methods of event detection should provide an overall estimate closest to the true PADR incidence; however, our estimate should be considered with caution due to the statistical heterogeneity found in this group of studies. Future studies should employ prospective methods of detection. This review demonstrates that the true overall incidence of PADRs is likely much greater than the overall pooled incidence estimate of 0.59 PADRs per 100 patients obtained when event detection method was not taken into consideration. [ABSTRACT FROM AUTHOR]
- Published
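The I² heterogeneity statistic cited in the abstract is derived from Cochran's Q under an inverse-variance (fixed-effect) pooling; a minimal sketch with invented effect sizes and variances, not the review's data:

```python
def i_squared(effects, variances):
    """Cochran's Q and the I^2 heterogeneity statistic for a
    fixed-effect (inverse-variance) pooled estimate.
    I^2 = max(0, (Q - df) / Q), the share of variability across
    studies attributable to heterogeneity rather than chance."""
    w = [1.0 / v for v in variances]
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0
    return q, i2

# Widely scattered incidence estimates produce a high I^2,
# as in the review's pooled analyses.
q, i2 = i_squared([0.1, 1.0, 5.0, 13.0], [0.01, 0.02, 0.05, 0.1])
```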
- 2018
- Full Text
- View/download PDF
29. Multidimensional mechanics: Performance mapping of natural biological systems using permutated radar charts.
- Author
-
Porter, Michael M. and Niksiar, Pooya
- Subjects
BIOLOGICAL systems ,BIOMECHANICS ,STIFFNESS (Engineering) ,RADAR ,PERMUTATIONS ,VISUALIZATION - Abstract
Comparing the functional performance of biological systems often requires comparing multiple mechanical properties. Such analyses, however, are commonly presented using orthogonal plots that compare N ≤ 3 properties. Here, we develop a multidimensional visualization strategy using permutated radar charts (radial, multi-axis plots) to compare the relative performance distributions of mechanical systems on a single graphic across N ≥ 3 properties. Leveraging the fact that radar charts plot data in the form of closed polygonal profiles, we use shape descriptors for quantitative comparisons. We identify mechanical property-function correlations distinctive to rigid, flexible, and damage-tolerant biological materials in the form of structural ties, beams, shells, and foams. We also show that the microstructures of dentin, bone, tendon, skin, and cartilage dictate their tensile performance, exhibiting a trade-off between stiffness and extensibility. Lastly, we compare the feeding versus singing performance of Darwin’s finches to demonstrate the potential of radar charts for multidimensional comparisons beyond mechanics of materials. [ABSTRACT FROM AUTHOR]
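Because a radar chart traces a closed polygon, simple shape descriptors such as the enclosed area follow directly from the plotted radii; a sketch of that idea (not the authors' specific descriptor set):

```python
import math

def radar_area(radii):
    """Area of the closed polygon a radar chart traces when the values
    in `radii` are plotted on equally spaced axes: the sum of the
    triangles between adjacent axes, (1/2) * r_i * r_{i+1} * sin(step)."""
    n = len(radii)
    step = 2 * math.pi / n
    return 0.5 * math.sin(step) * sum(
        radii[i] * radii[(i + 1) % n] for i in range(n))

# Four axes at unit radius give a square with vertices at distance 1
# from the center, i.e. a diagonal of 2 and an area of 2.
sq = radar_area([1, 1, 1, 1])
```

Descriptors like this make profiles comparable on one number, but note the caveat implicit in the paper's "permutated" charts: the area depends on the ordering of axes, so permutations of the same properties trace polygons of different shape.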
- Published
- 2018
- Full Text
- View/download PDF
30. Risk-adjusted monitoring of surgical performance.
- Author
-
Li, Jianbo, Jiang, Jiancheng, Jiang, Xuejun, and Liu, Lin
- Subjects
CARDIAC surgery ,CUSUM technique ,LOGISTIC regression analysis ,APPROXIMATION theory ,MAXIMUM likelihood statistics - Abstract
We propose a nonparametric risk-adjusted cumulative sum chart to monitor surgical outcomes for patients with different risks of post-operative mortality due to risk factors that exist before the surgery. The risk adjustment is accomplished using varying-coefficient logistic regression models. Unknown coefficient functions are estimated by global polynomial spline approximation based on the maximum likelihood principle. We suggest a bisection minimization approach and a bootstrap method to determine the chart's testing limit value. Compared with the previous (parametric) risk-adjusted cumulative sum chart, a major advantage of our method is that the mortality rate can be modeled more flexibly by related covariates, which significantly enhances the monitoring efficiency. Simulations demonstrate the good performance of our proposed procedure, and an application to a UK cardiac surgery dataset illustrates the use of our methodology. [ABSTRACT FROM AUTHOR]
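The classic parametric risk-adjusted CUSUM that this paper generalizes updates a zero-clipped statistic with a per-patient log-likelihood-ratio score comparing the patient's predicted death probability against an odds-ratio-shifted alternative. Below is a minimal sketch of that baseline chart with invented inputs, not the authors' nonparametric spline-based variant:

```python
import math

def racusum(outcomes, risks, odds_ratio=2.0):
    """Risk-adjusted CUSUM path for surgical mortality.

    For each patient with pre-operative death probability p, the
    alternative hypothesis shifts the odds by `odds_ratio` (R), giving
    log-likelihood-ratio scores log(R / (1 - p + R*p)) for a death and
    log(1 / (1 - p + R*p)) for a survival; the chart statistic is
    W_t = max(0, W_{t-1} + score) and signals when it crosses a limit."""
    w = 0.0
    path = []
    for died, p in zip(outcomes, risks):
        denom = 1 - p + odds_ratio * p
        score = math.log(odds_ratio / denom) if died else math.log(1 / denom)
        w = max(0.0, w + score)
        path.append(w)
    return path

# Two survivals keep the chart at zero; a run of deaths drives it up.
path = racusum([0, 0, 1, 1, 1], [0.1, 0.2, 0.1, 0.1, 0.2])
```

The paper's point is that modeling p itself more flexibly (via covariate-dependent coefficient functions rather than a fixed parametric risk model) sharpens these per-patient scores and hence the chart's monitoring efficiency.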
- Published
- 2018
- Full Text
- View/download PDF
31. ODM Data Analysis—A tool for the automatic validation, monitoring and generation of generic descriptive statistics of patient data.
- Author
-
Brix, Tobias Johannes, Bruland, Philipp, Sarfraz, Saad, Ernsting, Jan, Neuhaus, Philipp, Storck, Michael, Doods, Justin, Ständer, Sonja, and Dugas, Martin
- Subjects
DESCRIPTIVE statistics ,ELECTRONIC health records ,DATA analysis ,MEDICAL research ,CLINICAL trials - Abstract
Introduction: A required step in presenting the results of clinical studies is the declaration of participants' demographic and baseline characteristics, as required by FDAAA 801. The common workflow for this task is to export the clinical data from the electronic data capture system in use and import it into statistical software such as SAS or IBM SPSS. This software requires trained users, who have to implement the analysis individually for each item. These expenditures may become an obstacle for small studies. The objective of this work is to design, implement and evaluate an open-source application, called ODM Data Analysis, for the semi-automatic analysis of clinical study data. Methods: The system requires clinical data in the CDISC Operational Data Model format. After the file is uploaded, its syntax and the data-type conformity of the collected data are validated. The completeness of the study data is determined and basic statistics, including illustrative charts for each item, are generated. Datasets from four clinical studies have been used to evaluate the application's performance and functionality. Results: The system is implemented as an open-source web application (available at ) and is also provided as a Docker image, which enables easy distribution and installation on local systems. Study data is stored in the application only while the calculations are performed, which is compliant with data-protection requirements. Analysis times are below half an hour, even for larger studies with over 6000 subjects. Discussion: Medical experts have confirmed the usefulness of this application for gaining an overview of their collected study data for monitoring purposes and for generating descriptive statistics without further user interaction. The semi-automatic analysis has its limitations and cannot replace the complex analyses of statisticians, but it can serve as a starting point for their examination and reporting. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
32. An alternative application of Rasch analysis to assess data from ophthalmic patient-reported outcome instruments.
- Author
-
McNeely, Richard N., Moutari, Salissou, Arba-Mosquera, Samuel, Verma, Shwetabh, and Moore, Jonathan E.
- Subjects
MAXIMUM likelihood statistics ,GLARE ,OPHTHALMOLOGY ,RASCH models ,QUESTIONNAIRES - Abstract
Purpose: To highlight the potential shortcomings associated with the current use of Rasch analysis for validation of ophthalmic questionnaires, and to present an alternative application of Rasch analysis that derives insights specific to the cohort of patients under investigation. Methods: An alternative application of Rasch analysis was used to investigate the quality of vision (QoV) of a cohort of 481 patients. Patients received multifocal intraocular lenses and completed a QoV questionnaire one and twelve months post-operatively. The rating scale variant of the polytomous Rasch model was utilized, with its parameters estimated by joint maximum likelihood estimation. Analysis was performed on data from both post-operative assessments and the outcomes were compared. Results: The distribution of the location of symptoms altered between assessments, with the most annoyed patients differing completely. One month post-operatively the most prevalent symptom was starbursts, compared to glare at twelve months. The visual discomfort of the most annoyed patients was substantially higher at twelve months. The currently most advocated approach for validating questionnaires using Rasch analysis found the questionnaire "Rasch-valid" one month post-operatively and "Rasch-invalid" twelve months post-operatively. Conclusion: The proposed alternative application of Rasch analysis to questionnaires can be used as an effective decision-support tool at the population and individual levels. At the population level, the new approach enables one to investigate the prevalence of symptoms across different cohorts of patients; at the individual level, it enables one to identify patients with poor QoV over time. This study highlights some of the potential shortcomings associated with the current use of Rasch analysis to validate questionnaires. [ABSTRACT FROM AUTHOR]
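The rating-scale variant of the polytomous Rasch model mentioned in the methods assigns category probabilities from a person ability, an item difficulty, and category thresholds shared across items; a minimal sketch with illustrative parameter values (the parameter estimation itself, by joint maximum likelihood, is not shown):

```python
import math

def rating_scale_probs(theta, b, taus):
    """Category probabilities under the rating-scale Rasch model:
    person ability `theta`, item difficulty `b`, and shared category
    thresholds `taus` (tau_1..tau_m). The unnormalised log-weight of
    category x is the cumulative sum of (theta - b - tau_k) for k<=x,
    with category 0 fixed at log-weight 0."""
    logits = [0.0]
    for tau in taus:
        logits.append(logits[-1] + (theta - b - tau))
    weights = [math.exp(l) for l in logits]
    total = sum(weights)
    return [w / total for w in weights]

# Three response categories (0, 1, 2) via two thresholds.
probs = rating_scale_probs(theta=0.5, b=0.0, taus=[-1.0, 1.0])
```

With ability slightly above the item difficulty and these thresholds, the middle category is the most probable response.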
- Published
- 2018
- Full Text
- View/download PDF
33. An Eigenspace approach for detecting multiple space-time disease clusters: Application to measles hotspots detection in Khyber-Pakhtunkhwa, Pakistan.
- Author
-
Ullah, Sami, Daud, Hanita, Dass, Sarat C., Fanaee-T, Hadi, and Khalil, Alamgir
- Subjects
MEASLES ,SPATIOTEMPORAL processes ,VIRUS diseases ,COMMUNICABLE diseases ,ALGORITHMS - Abstract
Identifying abnormally high-risk regions in a spatiotemporal space that contain an unexpected disease count is helpful for conducting surveillance and implementing control strategies. The EigenSpot algorithm has recently been proposed for detecting space-time disease clusters of arbitrary shape with no restriction on the distribution or quality of the data, and has shown some promising advantages over the state-of-the-art methods. However, the main problem with the EigenSpot method is that it cannot be adapted to detect more than one spatiotemporal hotspot. This is an important limitation since, in reality, we may have multiple hotspots, sometimes at the same level of importance. We propose an extension of the EigenSpot algorithm, called Multi-EigenSpot, that is able to handle multiple hotspots by iteratively removing previously detected hotspots and re-running the algorithm until no more hotspots are found. In addition, a visualization tool (heatmap) has been linked to the proposed algorithm to visualize multiple clusters with different colors. We evaluated the proposed method using monthly data on measles cases in Khyber-Pakhtunkhwa, Pakistan (Jan 2016–Dec 2016), and its efficiency was compared with the state-of-the-art methods EigenSpot and the space-time scan statistic (SaTScan). The results showed the effectiveness of the proposed method for detecting multiple clusters in a spatiotemporal space. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
34. Mediation analysis for logistic regression with interactions: Application of a surrogate marker in ophthalmology.
- Author
-
Jensen, Signe M., Hauger, Hanne, and Ritz, Christian
- Subjects
VISUAL acuity ,OPHTHALMOLOGY ,VISUAL perception ,RETINAL degeneration ,MEDIATION (Statistics) ,LOGISTIC regression analysis - Abstract
Mediation analysis is often based on fitting two models, one including and another excluding a potential mediator, and subsequently quantifying the mediated effects by combining parameter estimates from these two models. Standard errors of such derived parameters may be approximated using the delta method. For a study evaluating a treatment effect on visual acuity, a binary outcome, we demonstrate how mediation analysis may conveniently be carried out by means of marginally fitted logistic regression models in combination with the delta method. Several metrics of mediation are estimated and the results are compared to findings obtained using existing methods. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
35. Graph-based analysis of brain connectivity in schizophrenia.
- Author
-
Olejarczyk, Elzbieta and Jernajczyk, Wojciech
- Subjects
PEOPLE with schizophrenia ,BRAIN imaging ,BRAIN function localization ,GRAPH theory ,ELECTROENCEPHALOGRAPHY - Abstract
The present study evaluated brain connectivity using electroencephalography (EEG) data from 14 patients with schizophrenia and 14 healthy controls. Phase-Locking Value (PLV), Phase-Lag Index (PLI) and Directed Transfer Function (DTF) were calculated for the original EEG data and following current source density (CSD) transformation, re-referencing using the average reference electrode (AVERAGE) and reference electrode standardization techniques (REST). The statistical analysis of adjacency matrices was carried out using indices based on graph theory. Both CSD and REST reduced the influence of volume conducted currents. The largest group differences in connectivity were observed for the alpha band. Schizophrenic patients showed reduced connectivity strength, as well as a lower clustering coefficient and shorter characteristic path length for both measures of phase synchronization following CSD transformation or REST re-referencing. Reduced synchronization was accompanied by increased directional flow from the occipital region for the alpha band. Following the REST re-referencing, the sources of alpha activity were located at parietal rather than occipital derivations. The results of PLV and DTF demonstrated group differences in fronto-posterior asymmetry following CSD transformation, while for PLI the differences were significant only using REST. The only analysis that identified group differences in inter-hemispheric asymmetry was DTF calculated for REST. Our results suggest that a comparison of different connectivity measures using graph-based indices for each frequency band, separately, may be a useful tool in the study of disconnectivity disorders such as schizophrenia. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
36. Performance analysis of a machine learning flagging system used to identify a group of individuals at a high risk for colorectal cancer.
- Author
-
Kinar, Yaron, Akiva, Pinchas, Choman, Eran, Kariv, Revital, Shalev, Varda, Levin, Bernard, Narod, Steven A., and Goshen, Ran
- Subjects
COLON cancer risk factors ,MEDICAL records ,HEMATOLOGY ,HEMATOLOGIC malignancies ,GASTROINTESTINAL diseases - Abstract
Individuals with colorectal cancer (CRC) have a tendency to intestinal bleeding, which may result in mild to severe iron deficiency anemia, but for many colon cancer patients hematological abnormalities are subtle. The fecal occult blood test (FOBT) is used as a pre-screening test whereby those with a positive FOBT are referred to colonoscopy. We sought to determine whether information contained in the complete blood count (CBC) report could be processed automatically and used to predict the presence of occult colorectal cancer in the setting of a large health services plan. Using the health records of Maccabi Health Services (MHS), we reviewed CBC reports for 112,584 study subjects, of whom 133 were diagnosed with CRC in 2008, and analysed these with the MeScore tool. The odds ratio for being diagnosed with CRC in 2008 was calculated with respect to the MeScore, using cutoffs at the 97th and 99th percentiles. For individuals in the highest one percentile, the odds ratio for CRC was 21.8 (95% CI 13.8 to 34.2). For the majority of the individuals with cancer, CRC was not suspected at the time of the blood draw. Frequent use of anticoagulants, the presence of other gastrointestinal pathologies, and non-GI malignancies were associated with false-positive MeScores. The MeScore can help identify individuals in the population who would benefit most from CRC screening, including those with no clinical signs or symptoms of CRC. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
37. Manual Wheelchair Skills Training for Community-Dwelling Veterans with Spinal Cord Injury: A Randomized Controlled Trial.
- Author
-
Kirby, R. Lee, Mitchell, Doug, Sabharwal, Sunil, McCranie, Mark, and Nelson, Audrey L.
- Subjects
SPINAL cord injuries ,WHEELCHAIRS ,REHABILITATION centers ,PARTICIPANT observation ,FOLLOW-up studies (Medicine) - Abstract
Objectives: To test the hypotheses that community-dwelling veterans with spinal cord injury (SCI) who receive the Wheelchair Skills Training Program (WSTP) in their own environments significantly improve their manual wheelchair-skills capacity, retain those improvements at one year and improve participation in comparison with an Educational Control (EC) group. Methods: We carried out a randomized controlled trial, studying 106 veterans with SCI from three Veterans Affairs rehabilitation centers. Each participant received either five one-on-one WSTP or EC sessions 30–45 minutes in duration. The main outcome measures were the total and subtotal percentage capacity scores from the Wheelchair Skills Test 4.1 (WST) and Craig Handicap Assessment and Reporting Technique (CHART) scores. Results: Participants in the WSTP group improved their total and Advanced-level WST scores by 7.1% and 30.1% relative to baseline (p < 0.001) and retained their scores at one year follow-up. The success rates for individual skills were consistent with the total and subtotal WST scores. The CHART Mobility sub-score improved by 3.2% over baseline (p = 0.021). Conclusions: Individualized wheelchair skills training in the home environment substantially improves the advanced and total wheelchair skills capacity of experienced community-dwelling veterans with SCI but has only a small impact on participation. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
38. Real-Time Reliability Verification for UAV Flight Control System Supporting Airworthiness Certification.
- Author
-
Xu, Haiyang and Wang, Ping
- Subjects
DRONE aircraft ,FLIGHT control systems ,AIRWORTHINESS certificates ,DYNAMIC models ,REAL-time control - Abstract
To verify the real-time reliability of an unmanned aerial vehicle (UAV) flight control system and comply with the airworthiness certification standard, we proposed a model-based integration framework for modeling and verifying timing properties. Building on the advantages of MARTE, this framework uses class diagrams to create the static model of the software system and state charts to create the dynamic model. In terms of the defined transformation rules, the MARTE model can be transformed into a formal integrated model, and the different parts of the model can also be verified using existing formal tools. For the real-time specifications of the software system, we also proposed a generating algorithm for temporal logic formulas, which automatically extracts real-time properties from time-sensitive live sequence charts (TLSC). Finally, we modeled the simplified flight control system of a UAV to check its real-time properties. The results showed that the framework can be used to create the system model, as well as to precisely analyze and verify the real-time reliability of a UAV flight control system. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
39. Detecting Visual Function Abnormality with a Contrast-Dependent Visual Test in Patients with Type 2 Diabetes.
- Author
-
Tsai, Li-Ting, Liao, Kuo-Meng, Jang, Yuh, Hu, Fu-Chang, and Wu, Wei-Chi
- Subjects
DIABETIC retinopathy treatment ,DIABETIC retinopathy ,VISUAL perception ,HYPERTENSION ,MEDICAL screening ,REGRESSION analysis ,PATIENTS - Abstract
In addition to diabetic retinopathy, diabetes also causes early retinal neurodegeneration and other eye problems, which cause various types of visual deficits. This study used a computer-based visual test (Macular Multi-Function Assessment (MMFA)) to assess contrast-dependent macular visual function in patients with type 2 diabetes to collect more visual information than possible with only the visual acuity test. Because the MMFA is a newly developed test, this study first compared the agreement and discriminative ability of the MMFA and the Early Treatment Diabetic Retinopathy Study (ETDRS) contrast acuity charts. Then symbol discrimination performances of diabetic patients and controls were evaluated at 4 contrast levels using the MMFA. Seventy-seven patients and 45 controls participated. The agreement between MMFA and ETDRS scores was examined by fitting three-level linear mixed-effect models to estimate the intraclass correlation coefficients (ICCs). The estimated areas under the receiver operating characteristic (ROC) curve were used to compare the discriminative ability of diseased versus non-diseased participants between the two tests. The MMFA scores of patients and controls were compared with multiple linear regression analysis after adjusting the effects of age, sex, hypertension and cataract. Results showed that the scores of the MMFA and ETDRS tests displayed high levels of agreement and acceptable and similar discriminative ability. The MMFA performance was correlated with the severity of diabetic retinopathy. Most of the MMFA scores differed significantly between the diabetic patients and controls. In the low contrast condition, the MMFA scores were significantly lower for non-DR patients than for controls. The potential utility of the MMFA as an easy screening tool for contrast-dependent visual function and for detecting early functional visual change in patients with type 2 diabetes is discussed. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
40. Is It Time to Change Our Reference Curve for Femur Length? Using the Z-Score to Select the Best Chart in a Chinese Population.
- Author
-
Li, Boya, Yang, Huixia, Wei, Yumei, Su, Rina, Wang, Chen, Meng, Wenying, Wang, Yongqing, Shang, Lixin, Cai, Zhenyu, Ji, Liping, Wang, Yunfeng, Sun, Ying, Liu, Jiaxiu, Wei, Li, Sun, Yufeng, Zhang, Xueying, Luo, Tianxia, Chen, Haixia, and Yu, Lijun
- Subjects
FEMUR ,DELIVERY (Obstetrics) ,HEALTH of Chinese people ,GESTATIONAL age ,GAUSSIAN distribution - Abstract
Objective: To use Z-scores to compare different charts of femur length (FL) applied to our population, with the aim of identifying the most appropriate chart. Methods: A retrospective study was conducted in Beijing. Fifteen hospitals in Beijing were chosen as clusters using a systematic cluster sampling method, in which 15,194 pregnant women delivered from June 20th to November 30th, 2013. The measurements of FL in the second and third trimesters were recorded, as well as the last measurement obtained before delivery. Based on the inclusion and exclusion criteria, we identified FL measurements from 19,996 ultrasounds from 7,194 patients between 11 and 42 weeks of gestation. The FL data were then transformed into Z-scores calculated using three series of reference equations obtained from three reports: Leung TN, Pang MW et al (2008); Chitty LS, Altman DG et al (1994); and Papageorghiou AT et al (2014). Each Z-score distribution was summarized by its mean, standard deviation (SD), skewness and kurtosis, and compared with the standard normal distribution using the Kolmogorov-Smirnov test. The histogram of each distribution was superimposed on the non-skewed standard normal curve (mean = 0, SD = 1) to provide a direct visual impression. Finally, the sensitivity and specificity of each reference chart for identifying fetuses <5th or >95th percentile (based on the observed distribution of Z-scores) were calculated, and the Youden index was also listed. A scatter diagram with the 5th, 50th, and 95th percentile curves calculated from and superimposed on each reference chart was presented to provide a visual impression. Results: The three Z-score distribution curves appeared to be normal, but none of them matched the expected standard normal distribution. In our study, the Papageorghiou reference curve provided the best results, with a sensitivity of 100% for identifying fetuses with measurements <5th and >95th percentile, and specificities of 99.9% and 81.5%, respectively. Conclusions: It is important to choose an appropriate reference curve when defining what is normal. The Papageorghiou reference curve for FL seems to be the best fit for our population. Perhaps it is time to change our reference curve for femur length. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
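The chart comparison in the record above summarizes each reference curve by sensitivity, specificity, and the Youden index. The index is the standard J = sensitivity + specificity - 1; the function name below is ours, and the figures in the comments are the ones reported in the abstract:

```python
def youden_index(sensitivity: float, specificity: float) -> float:
    """Youden's J statistic: J = sensitivity + specificity - 1.
    Ranges from 0 (uninformative test) to 1 (perfect test)."""
    return sensitivity + specificity - 1.0

# Figures reported for the Papageorghiou curve: 100% sensitivity,
# with specificities of 99.9% (<5th percentile) and 81.5% (>95th).
j_under_5th = youden_index(1.00, 0.999)  # approx. 0.999
j_over_95th = youden_index(1.00, 0.815)  # approx. 0.815
```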
41. Discrepancy and Disliking Do Not Induce Negative Opinion Shifts.
- Author
-
Takács, Károly, Flache, Andreas, and Mäs, Michael
- Subjects
SOCIAL psychology ,OPINION (Philosophy) ,DISCREPANCY theorem ,SOCIAL influence ,LIKES & dislikes ,EMPIRICAL research - Abstract
Both classical social psychological theories and recent formal models of opinion differentiation and bi-polarization assign a prominent role to negative social influence. Negative influence is defined as shifts away from the opinion of others and hypothesized to be induced by discrepancy with or disliking of the source of influence. There is strong empirical support for the presence of positive social influence (a shift towards the opinion of others), but evidence that large opinion differences or disliking could trigger negative shifts is mixed. We examine positive and negative influence with controlled exposure to opinions of other individuals in one experiment and with opinion exchange in another study. Results confirm that similarities induce attraction, but results do not support that discrepancy or disliking entails negative influence. Instead, our findings suggest a robust positive linear relationship between opinion distance and opinion shifts. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
42. Bayesian Top-Down Protein Sequence Alignment with Inferred Position-Specific Gap Penalties.
- Author
-
Neuwald, Andrew F. and Altschul, Stephen F.
- Subjects
MARKOV processes ,AMINO acid sequence ,SEQUENCE alignment ,BAYESIAN analysis ,DIRICHLET principle - Abstract
We describe a Bayesian Markov chain Monte Carlo (MCMC) sampler for protein multiple sequence alignment (MSA) that, as implemented in the program GISMO and applied to large numbers of diverse sequences, is more accurate than the popular MSA programs MUSCLE, MAFFT, Clustal-Ω and Kalign. Features of GISMO central to its performance are: (i) It employs a “top-down” strategy with a favorable asymptotic time complexity that first identifies regions generally shared by all the input sequences, and then realigns closely related subgroups in tandem. (ii) It infers position-specific gap penalties that favor insertions or deletions (indels) within each sequence at alignment positions in which indels are invoked in other sequences. This favors the placement of insertions between conserved blocks, which can be understood as making up the proteins’ structural core. (iii) It uses a Bayesian statistical measure of alignment quality based on the minimum description length principle and on Dirichlet mixture priors. Consequently, GISMO aligns sequence regions only when statistically justified. This is unlike methods based on the ad hoc, but widely used, sum-of-the-pairs scoring system, which will align random sequences. (iv) It defines a system for exploring alignment space that provides natural avenues for further experimentation through the development of new sampling strategies for more efficiently escaping from suboptimal traps. GISMO’s superior performance is illustrated using 408 protein sets containing, on average, 235 sequences. These sets correspond to NCBI Conserved Domain Database alignments, which have been manually curated in the light of available crystal structures, and thus provide a means to assess alignment accuracy. GISMO fills a different niche than other MSA programs, namely identifying and aligning a conserved domain present within a large, diverse set of full length sequences. The GISMO program is available at . [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
43. The Value of Contrast-Enhanced Ultrasonography and Contrast-Enhanced CT in the Diagnosis of Malignant Renal Cystic Lesions: A Meta-Analysis.
- Author
-
Lan, Dong, Qu, Hong-Chen, Li, Ning, Zhu, Xing-Wang, Liu, Yi-Li, and Liu, Chun-Lai
- Subjects
CYSTIC kidney disease ,CONTRAST-enhanced ultrasound ,LIKELIHOOD ratio tests ,RECEIVER operating characteristic curves ,META-analysis ,DIAGNOSIS - Abstract
We compared the efficacy of contrast-enhanced ultrasound (CEUS) and contrast-enhanced computed tomography (CECT) for the diagnosis of renal cystic lesions via a meta-analysis to determine the value of CEUS in the prediction of the malignant potential of complex renal cysts. Eleven studies were evaluated: 4 control studies related to CEUS and CECT, 3 studies related to CEUS and 4 studies related to CECT. According to the random effects model, the pooled sensitivity, specificity, positive likelihood ratio, and negative likelihood ratio for CEUS/CECT were 0.95/0.90, 0.79/0.85, 4.39/5.00, and 0.10/0.15, respectively. The areas under the summary receiver operating characteristic (AUCs-SROC) curves for the two methods were 94.24% and 93.39%, and the estimated Q values were 0.8805 and 0.8698, respectively. Comparing the Q index values of CEUS and CECT revealed no significant difference between the two methods (P>0.05). When compared with conventional CECT, CEUS is also useful for diagnosing renal cystic lesions in the clinic. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
44. Comparing the Effectiveness of Bevacizumab to Ranibizumab in Patients with Exudative Age-Related Macular Degeneration. The BRAMD Study.
- Author
-
Schauwvlieghe, A. M. E., Dijkman, G., Hooymans, J. M., Verbraak, F. D., Hoyng, C. B., Dijkgraaf, M. G. W., Peto, T., Vingerling, J. R., and Schlingemann, R. O.
- Subjects
BEVACIZUMAB ,RETINAL degeneration treatment ,RANIBIZUMAB ,DRUG efficacy ,VISUAL acuity ,CLINICAL trials ,THERAPEUTICS - Abstract
Purpose: To compare the effectiveness of bevacizumab and ranibizumab in the treatment of exudative age-related macular degeneration (AMD). Design: Multicentre, randomized, controlled, double-masked clinical trial in 327 patients. The non-inferiority margin was 4 letters. Patients: Patients ≥ 60 years of age with primary or recurrent sub- or juxtafoveal choroidal neovascularization (CNV) secondary to AMD with a total area of CNV < 12 disc areas and a best corrected visual acuity (BCVA) score between 20 and 78 letters on an ETDRS-like chart in the study eye. Methods: Monthly intravitreal injections with 1.25 mg bevacizumab or 0.5 mg ranibizumab were given during one year. Intention-to-treat analysis with last observation carried forward was performed. Main Outcome Measures: Primary outcome was the change in BCVA in the study eye from baseline to 12 months. Results: The mean gain in BCVA was 5.1 (±14.1) letters in the bevacizumab group (n = 161) and 6.4 (±12.2) letters in the ranibizumab group (n = 166) (p = 0.37). The lower limit of the 95% confidence interval of the difference in BCVA gain was 3.72. The response to bevacizumab was more varied; 24% of patients showed a gain of ≥15 letters, 11% a loss of ≥15 letters and 65% a gain or loss < 15 letters compared to 19%, 5% and 76% respectively for ranibizumab (p = 0.038). No significant differences in absolute CRT and CRT change (p = 0.13) or in the presence of subretinal or intraretinal fluid (p = 0.14 and 0.10, respectively) were observed. However, the presence of any fluid on SD-OCT (subretinal and/or intraretinal) differed significantly (p = 0.020), with definite fluid on SD-OCT in 45% of the patients for bevacizumab versus 31% for ranibizumab. The occurrence of serious adverse events and adverse events was similar, with 34 SAEs and 256 AEs in the bevacizumab group and 37 SAEs and 299 AEs in the ranibizumab group (p = 0.87 and p = 0.48, respectively). Conclusions: Bevacizumab was not inferior to ranibizumab.
The response to bevacizumab was more varied with higher percentages of both gainers and losers and more frequently observed retinal fluid on SD-OCT at 12 months when compared to the ranibizumab group. Trial Registration: Trialregister.nl [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
45. Indoor Spatial Updating with Reduced Visual Information.
- Author
-
Legge, Gordon E., Gage, Rachel, Baek, Yihwa, and Bochsler, Tiana M.
- Subjects
SPATIAL ability ,VISUAL perception ,INFORMATION processing ,ORIENTATION physiology ,PEOPLE with visual disabilities ,VISUAL acuity - Abstract
Purpose: Spatial updating refers to the ability to keep track of position and orientation while moving through an environment. People with impaired vision may be less accurate in spatial updating with adverse consequences for indoor navigation. In this study, we asked how artificial restrictions on visual acuity and field size affect spatial updating, and also judgments of the size of rooms. Methods: Normally sighted young adults were tested with artificial restriction of acuity in Mild Blur (Snellen 20/135) and Severe Blur (Snellen 20/900) conditions, and a Narrow Field (8°) condition. The subjects estimated the dimensions of seven rectangular rooms with and without these visual restrictions. They were also guided along three-segment paths in the rooms. At the end of each path, they were asked to estimate the distance and direction to the starting location. In Experiment 1, the subjects walked along the path. In Experiment 2, they were pushed in a wheelchair to determine if reduced proprioceptive input would result in poorer spatial updating. Results: With unrestricted vision, mean Weber fractions for room-size estimates were near 20%. Severe Blur but not Mild Blur yielded larger errors in room-size judgments. The Narrow Field was associated with increased error, but less than with Severe Blur. There was no effect of visual restriction on estimates of distance back to the starting location, and only Severe Blur yielded larger errors in the direction estimates. Contrary to expectation, the wheelchair subjects did not exhibit poorer updating performance than the walking subjects, nor did they show greater dependence on visual condition. Discussion: If our results generalize to people with low vision, severe deficits in acuity or field will adversely affect the ability to judge the size of indoor spaces, but updating of position and orientation may be less affected by visual impairment. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
46. Preliminary Retrospective Analysis of Daily Tomotherapy Output Constancy Checks Using Statistical Process Control.
- Author
-
Mezzenga, Emilio, D’Errico, Vincenzo, Sarnelli, Anna, Strigari, Lidia, Menghi, Enrico, Marcocci, Francesco, Bianchini, David, and Benassi, Marcello
- Subjects
RETROSPECTIVE studies ,STATISTICAL process control ,PROBABILITY theory ,COMPUTERS in medicine ,RADIOTHERAPY - Abstract
The purpose of this study was to retrospectively evaluate daily static and dynamic output checks from a Helical TomoTherapy Hi-Art treatment system using statistical process control methods. Individual value X-charts, exponentially weighted moving average charts, and process capability and acceptability indices were used to monitor the treatment system's performance. Daily output values measured from January 2014 to January 2015 were considered. The results showed that, although the process was in control overall, an out-of-control situation coincided with the principal maintenance intervention on the treatment system. In particular, process capability indices showed a decreasing percentage of points in control, which was nonetheless acceptable according to AAPM TG-148 guidelines. Our findings underline the importance of restricting the acceptable range of daily output checks and suggest a future line of investigation for detailed process control of daily output checks for the Helical TomoTherapy Hi-Art treatment system. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
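The exponentially weighted moving average charts mentioned in the record above are built on the textbook recursion z_t = lam * x_t + (1 - lam) * z_{t-1}. A minimal sketch of that recursion follows; lam = 0.2 is a common textbook default, not a value taken from the study:

```python
def ewma(values, lam=0.2):
    """Exponentially weighted moving average, the statistic plotted on
    an EWMA control chart: z_t = lam * x_t + (1 - lam) * z_{t-1},
    seeded with the first observation."""
    z = values[0]
    out = [z]
    for x in values[1:]:
        z = lam * x + (1 - lam) * z
        out.append(z)
    return out
```

A constant series reproduces itself, while a step change is smoothed in gradually, which is what makes the chart sensitive to small persistent output drifts.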
47. Reference Charts for Fetal Cerebellar Vermis Height: A Prospective Cross-Sectional Study of 10605 Fetuses.
- Author
-
Cignini, Pietro, Giorlandino, Maurizio, Brutti, Pierpaolo, Mangiafico, Lucia, Aloisi, Alessia, and Giorlandino, Claudio
- Subjects
CEREBELLUM diseases ,FETAL diseases ,GESTATIONAL age ,BIOMETRY ,REGRESSION analysis ,CROSS-sectional method ,LONGITUDINAL method ,DIAGNOSIS - Abstract
Objective: To establish reference charts for fetal cerebellar vermis height in an unselected population. Methods: A prospective cross-sectional study between September 2009 and December 2014 was carried out at ALTAMEDICA Fetal-Maternal Medical Centre, Rome, Italy. Of 25,203 fetal biometric measurements, 12,167 (48%) measurements of the cerebellar vermis were available. After excluding 1,562 (12.8%) measurements, a total of 10,605 (87.2%) fetuses were considered, each analyzed once only. Parametric and nonparametric quantile regression models were used for the statistical analysis. To evaluate the robustness of the proposed reference charts under various distributional assumptions on the ultrasound measurements at hand, we compared the gestational age-specific reference curves produced by the statistical methods used. Normal mean heights based on parametric and nonparametric methods were defined for each week of gestation, and the regression equation expressing the height of the cerebellar vermis as a function of gestational age was calculated. Finally, the correlation between dimension and gestation was measured. Results: The mean height of the cerebellar vermis was 12.7mm (SD, 1.6mm; 95% confidence interval, 12.7-12.8mm). The regression equation expressing the height of the CV as a function of gestational age was: height (mm) = -4.85 + 0.78 x gestational age. The correlation between dimension and gestation was expressed by the coefficient r = 0.87. Conclusion: This is the first prospective cross-sectional study on fetal cerebellar vermis biometry with such a large sample size reported in the literature. It is a detailed statistical survey and contains new centile-based reference charts for fetal cerebellar vermis height measurements. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
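The abstract above gives the fitted regression explicitly, so a point prediction can be sketched directly; the function name is ours, and the equation is exactly as reported (gestational age in weeks):

```python
def vermis_height_mm(gestational_age_weeks: float) -> float:
    """Predicted fetal cerebellar vermis height from the regression
    reported above: height (mm) = -4.85 + 0.78 x gestational age."""
    return -4.85 + 0.78 * gestational_age_weeks

# At 22.5 weeks the fitted mean is 12.70 mm, consistent with the
# reported overall mean height of 12.7 mm.
predicted = round(vermis_height_mm(22.5), 2)  # 12.7
```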
48. Relationship between Functional Visual Acuity and Useful Field of View in Elderly Drivers.
- Author
-
Negishi, Kazuno, Masui, Sachiko, Mimura, Masaru, Fujita, Yoshio, and Tsubota, Kazuo
- Subjects
VISUAL acuity ,PSYCHOLOGY of automobile drivers ,MEDICAL screening ,BINOCULAR vision ,COGNITIVE ability - Abstract
Purpose: To investigate the relationship between the functional visual acuity (FVA) and useful field of view (UFOV) in elderly drivers and assess the usefulness of the FVA test to screen driving aptitude in elderly drivers. Methods: This study included 45 elderly drivers (31 men, 14 women; mean age, 68.1 years) and 30 younger drivers (26 men, 4 women; mean age, 34.2 years) who drive regularly. All participants underwent measurement of the binocular corrected distant visual acuity (CDVA), binocular corrected distant FVA (CDFVA), and Visual Field with Inhibitory Tasks Elderly Version (VFIT-EV) to measure UFOV. The tear function and cognitive status also were evaluated. Results: The CDVA, the CDFVA, cognitive status, and the correct response rate (CAR) of the VFIT-EV were significantly worse in the elderly group than in the control group (P = 0.000 for all parameters). The cognitive status was correlated significantly with the CDVA (r = -0.301, P = 0.009), CDFVA (r = -0.402, P = 0.000), and the CAR of the VFIT-EV (r = 0.348, P = 0.002) in all subjects. The results of the tear function tests were not correlated with the CDVA, CDFVA, or VFIT-EV in any subjects. Stepwise regression analysis for all subjects in the elderly and control groups showed that the CDFVA predicted the CAR most significantly among the clinical factors evaluated. Conclusion: The FVA test is a promising method to screen the driving aptitude, including both visual and cognitive functions, in a short time. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
49. Risk equivalence as an alternative to balancing mean value when trading draft selections and players in major sporting leagues
- Author
-
Geoffrey N. Tuck and Shane A. Richards
- Subjects
ECONOMICS ,TRADING strategy ,MARKET value ,SPORTS science ,ATHLETIC performance ,PERSONNEL selection ,PROBABILITY theory ,LABOR economics ,ACTUARIAL science - Abstract
In sports leagues that use an annual draft to assign eligible players to clubs, having a value associated with a draft selection can allow clubs to anticipate the future growth of players and, if a trading period exists, assist negotiations when exchanging draft selections and players. Mean draft values typically decline in an exponential or geometric manner with increasing draft selection number, and aggregate mean values have been used to compare trade packages. However, clubs may also want to ensure that a trade does not increase the probability of obtaining poor players in the draft. This paper therefore considers equivalence of risk as an alternative trading strategy for club list managers. Here, risk is defined as the probability of the aggregate value of the received draft selections being below a minimum acceptable level. For risk equivalence, a premium over and above mean market value may need to be provided when trading to secure higher draft selections.
- Published
- 2019
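The risk definition in the abstract above (the probability that the aggregate value of the received picks falls below a minimum acceptable level) lends itself to a Monte Carlo sketch. Everything here is illustrative: the samplers are hypothetical callables drawing one value per pick, not the authors' draft-value model.

```python
import random

def trade_risk(pick_value_samplers, threshold, n_sim=10000, seed=1):
    """Monte Carlo estimate of trade risk: the fraction of simulations
    in which the summed values of the received draft picks fall below
    the minimum acceptable level `threshold`."""
    rng = random.Random(seed)  # seeded for reproducibility
    below = 0
    for _ in range(n_sim):
        total = sum(sampler(rng) for sampler in pick_value_samplers)
        if total < threshold:
            below += 1
    return below / n_sim
```

Two trade packages with equal mean value can then be compared by this probability rather than by aggregate mean alone, which is the substitution the paper proposes.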
50. Optimisation of children z-score calculation based on new statistical techniques
- Author
-
Peter Witters, Kris De Boeck, Mieke Boon, I. Asseiceira, Antonio Martinez-Millana, Carlos Fernandez-Llatas, Carmen Ribes-Koninckx, Jessie M. Hulst, Joaquin Calvo-Lerma, Vicente Traver, Ignacio Basagoiti, and Pediatrics
- Subjects
ANTHROPOMETRY ,BODY mass index ,NUTRITIONAL status ,STANDARD score ,PERCENTILES ,REGRESSION analysis ,GAUSSIAN processes ,GROWTH charts ,PEDIATRICS ,BODY height ,BODY weight - Abstract
Background: Expressing anthropometric parameters (height, weight, BMI) as z-scores is a key principle in the clinical assessment of children and adolescents. The Centers for Disease Control and Prevention (CDC) growth charts and the CDC-LMS method for z-score calculation are widely used to assess growth and nutritional status, though they can be imprecise in some percentiles. Objective: To improve the accuracy of z-score calculation by revising the statistical method, using the original data used to develop current z-score calculators. Design: A Gaussian Process Regression (GPR) model was designed and internally validated. Z-scores for weight-for-age (WFA), height-for-age (HFA) and BMI-for-age (BMIFA) were compared with the WHO and CDC-LMS methods on 1) standard z-score cut-off points, 2) a simulated population of 3000 children and 3) real observations from 212 children aged 2 to 18 years. Results: GPR yielded more accurate calculation of z-scores for standard cut-off points (p<. The study presented in this paper was developed in the context of the MyCyFAPP Project, funded by the European Union under Grant Agreement number 643806. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
- Published
- 2018
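The CDC-LMS method referenced in the record above converts a measurement X into a z-score via the Box-Cox parameters L (skewness), M (median) and S (coefficient of variation). Below is the standard LMS formula; the parameter values in the example are purely illustrative, not from any published growth chart:

```python
import math

def lms_z_score(x: float, L: float, M: float, S: float) -> float:
    """Standard LMS z-score: z = ((x/M)**L - 1) / (L*S) for L != 0,
    and z = ln(x/M) / S in the limit L -> 0."""
    if abs(L) < 1e-12:
        return math.log(x / M) / S
    return ((x / M) ** L - 1.0) / (L * S)

# Illustrative parameters only; with L = 1 the formula reduces to
# the familiar (x/M - 1) / S.
z = lms_z_score(17.6, L=1.0, M=16.0, S=0.1)  # approx. 1.0
```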