31 results for "Quality Score"
Search Results
2. Identification of Torquetenovirus Species in Patients with Kawasaki Disease Using a Newly Developed Species-Specific PCR Method.
- Author
- Spezia PG, Filippini F, Nagao Y, Sano T, Ishida T, and Maggi F
- Subjects
- High-Throughput Nucleotide Sequencing, Datasets as Topic, Humans, Male, Female, Infant, Child, Preschool, Child, Prospective Studies, DNA, Viral genetics, DNA, Viral isolation & purification, Torque teno virus genetics, Torque teno virus isolation & purification, Mucocutaneous Lymph Node Syndrome virology, Polymerase Chain Reaction methods, DNA Virus Infections virology
- Abstract
A next-generation sequencing (NGS) study identified a very high viral load of Torquetenovirus (TTV) in patients with Kawasaki disease (KD). We aimed to evaluate the feasibility of a newly developed quantitative species-specific TTV-PCR (ssTTV-PCR) method for identifying the etiology of KD. We applied ssTTV-PCR to samples collected from 11 KD patients and 22 matched control subjects who participated in our previous prospective study, and used the NGS dataset from that study to validate ssTTV-PCR. The TTV loads in whole blood and nasopharyngeal aspirates correlated highly (Spearman's R = 0.8931, p < 0.0001, n = 33), supporting the validity of ssTTV-PCR. The ssTTV-PCR and NGS results were largely consistent. However, inconsistencies occurred when ssTTV-PCR was more sensitive than NGS, when the PCR primer sequences mismatched the viral sequences in the participants, and when the NGS quality score was low. Interpretation of NGS requires complex procedures. ssTTV-PCR is more sensitive than NGS but may fail to detect a fast-evolving TTV species, so it would be prudent to update primer sets using NGS data. With this precaution, ssTTV-PCR can be used reliably in a future large-scale etiological study of KD.
- Published
- 2023
- Full Text
- View/download PDF
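The validation above rests on a Spearman rank correlation (R = 0.8931) between TTV loads in whole blood and nasopharyngeal aspirates. As an illustrative sketch of the statistic itself (not the authors' analysis code), Spearman's R is simply Pearson's correlation computed on the ranks of the two samples:

```python
def _ranks(xs):
    # Assign 1-based ranks, averaging the positions of tied values.
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank of the tied block
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_r(x, y):
    # Pearson correlation of the two rank vectors.
    rx, ry = _ranks(x), _ranks(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

A perfectly monotone increasing pairing yields R = 1, a perfectly reversed one R = -1.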
3. Regression trees to identify combinations of farming practices that achieve the best overall intrinsic quality of milk.
- Author
- Rey-Cadilhac L, Ferlay A, Gelé M, Léger S, and Laurent C
- Subjects
- Animals, Farms, Dairying methods, Diet, Milk, Cheese
- Abstract
Many studies over the last 30 years have shown the effects of farming practices on milk compounds. Combinations of practices may have antagonistic or synergistic effects on milk compounds, but these combination effects remain underinvestigated. Research needs to focus on overall intrinsic milk quality (including sensory, technological, health, and nutritional dimensions) and identify the combinations that can optimize it. The aim of this study was to identify which combinations of farming practices achieved the best scores for sensory, technological, health, and nutritional dimensions and for overall intrinsic milk quality. Ninety-nine private farms were visited once each to sample their bulk tank milk and survey their farming practices. The surveyed practices concerned herd characteristics, feeding management, housing conditions, and milking and milk storage conditions on the day of test. Analyses of bulk tank milk were designed to evaluate the overall intrinsic quality of the milk for 2 target products: raw milk cheese and semi-skimmed UHT milk. Regression trees were then used to identify the combinations of farming practices that achieved the best scores on each dimension and on overall intrinsic quality of the milk. Breed and diet (type of forage) were the most influential factors for the sensory and health dimensions and for the technological and nutritional dimension scores, respectively, in the cheese assessment. Overall cheese quality was highly positively correlated with these 4 dimension scores. Therefore, breed and diet emerged as the most influential practices in the regression tree for overall cheese quality. However, the combinations of practices that resulted in the best quality scores differed according to the dimension studied and the product targeted. This suggests that advice on farming practices to improve intrinsic milk quality needs to be adapted according to the end-purpose of the collected milk.
This innovative approach combining on-farm data and regression trees provides farm managers with a valuable and practical tool to prioritize practices in terms of their role in shaping milk quality, and to identify the combinations of practices that promote good milk quality and practice thresholds or modalities needed to achieve it., (The Authors. Published by Elsevier Inc. and Fass Inc. on behalf of the American Dairy Science Association®. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).)
- Published
- 2023
- Full Text
- View/download PDF
4. [Evaluation of the quality of medical prescriptions before the introduction of a therapeutic form: the case of the University Hospital Center Pr Bocar Sidy Sall of Kati].
- Author
- Traoré MDS, Traoré S, Sangho A, Coulibaly I, Diarra A, Coulibaly BF, Bah S, Ouedrago R, and Youl ENH
- Abstract
Objective: The general objective of this work was to evaluate the quality of medical prescriptions at the CHU Bocar SALL of Kati before the introduction of a therapeutic form., Methods: This was a cross-sectional study with prospective collection covering one year (April 2021-March 2022). A simple random sampling was carried out from the prescriptions (n=1283) of patients attending outpatient consultations and the files (n=847) of hospitalized patients., Results: Prescriptions were written mainly by medical specialists, accounting for 468 prescriptions and 612 patient files. The average number of drugs per prescription was 2.66; hospitalized patients received an average of 5.75 drugs. The "Prescription quality score" averaged 5.19 out of 8 points. A little more than half of the prescriptions (53.31%) were based on the national list of essential drugs. The treatments given to patients were consistent with the diagnoses, with a score of 4.14 out of 5 points., Conclusion: Compliance with the rules of good practice for medical prescriptions not only guarantees the quality of care offered to users, but also allows good planning and control of the establishment's future public health actions., (The editorial board reserves the right to return to the authors, before any submission to reviewers, manuscripts that do not conform to these presentation guidelines. Authors are also advised to keep a copy of the manuscript, figures, and tables.)
- Published
- 2023
5. Evaluation of the loss of fingermark ridge clarity as a function of biological sex.
- Author
- Salmeron LC and De Alcaraz-Fossoul J
- Subjects
- Male, Female, Humans, Powders, Touch, Glass, Dermatoglyphics, Sweat
- Abstract
Latent fingermark ridge patterns result from imprinting sweat secretions onto receiving surfaces. However, little is known about the loss of skin moisture between immediate consecutive depositions and its effects on the visual quality of ridges and their degradation over time. In practice, it is recurrently assumed that the first touch should contain the most residue and, therefore, display the highest ridge quality. A gradual decrease in the quantity of residue deposited, and in turn in the clarity of ridges, is also expected. In this study, a total of 480 fingermarks were obtained from 20 donors, 10 males and 10 females, to assess the pattern of ridge quality loss across six successive impressions in a depletion series. Black magnetic powder (BMP) was utilized to visualize and photograph fingermarks on glass microscope slides. After image standardization, Quality Scores (QS) as well as metrics on ridge clarity were obtained from the FBI's Universal Latent Workstation (ULW). Data analyses revealed a significant drop in ridge quality over the six consecutive depositions, most notably after deposition four. No differences in ridge clarity between sexes were detected within the first three depositions, although an effect was noted beyond this point. ULW proved to be an excellent and sensitive tool for detecting minute changes in ridge quality across the depletion series. These results may contribute to determining the chronological order of events and support further research in estimating time-since-deposition., (© 2022 American Academy of Forensic Sciences.)
- Published
- 2022
- Full Text
- View/download PDF
6. CSPP-IQA: a multi-scale spatial pyramid pooling-based approach for blind image quality assessment.
- Author
- Chen J, Qin F, Lu F, Guo L, Li C, Yan K, and Zhou X
- Abstract
Traditional image quality assessment (IQA) methods are usually based on convolutional neural networks (CNNs). In these CNN-based IQA methods, the fixed feature size of the fully connected layer requires the input image to be resized to a pre-defined shape, which usually destroys the original structure and content of the input image and thus reduces the accuracy of the quality assessment. In this paper, a blind image quality assessment method based on multi-scale spatial pyramid pooling (named CSPP-IQA) is proposed. CSPP-IQA accepts the original image when assessing image quality, without any resizing. Moreover, by incorporating a convolutional block attention module and an image understanding module, CSPP-IQA achieves better accuracy, generalization, and efficiency than traditional IQA methods. Experiments on real-scene IQA datasets verified the effectiveness and efficiency of CSPP-IQA., Competing Interests: The authors declare that they have no conflict of interest., (© The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature 2022. Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.)
- Published
- 2022
- Full Text
- View/download PDF
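The property the CSPP-IQA entry above relies on, that spatial pyramid pooling produces a fixed-length feature vector whatever the input resolution, can be sketched in a few lines. This toy pure-Python version (an illustrative sketch, not the paper's implementation) max-pools a 2D feature map at pyramid levels 1x1, 2x2, and 4x4:

```python
def spp_max_pool(feature_map, levels=(1, 2, 4)):
    """Max-pool a 2D feature map into fixed bins at each pyramid level.

    Output length is sum(n * n for n in levels), independent of input
    size (assumes both input dimensions are at least max(levels)).
    """
    h, w = len(feature_map), len(feature_map[0])
    out = []
    for n in levels:
        for bi in range(n):
            for bj in range(n):
                # Integer bin boundaries partition the map into n x n cells.
                r0, r1 = bi * h // n, (bi + 1) * h // n
                c0, c1 = bj * w // n, (bj + 1) * w // n
                out.append(max(feature_map[r][c]
                               for r in range(r0, r1)
                               for c in range(c0, c1)))
    return out
```

A 4x4 map and a 5x7 map both yield 1 + 4 + 16 = 21 features, which is what lets the downstream fully connected layer accept images of any size.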
7. CMIC: an efficient quality score compressor with random access functionality.
- Author
- Chen H, Chen J, Lu Z, and Wang R
- Subjects
- Algorithms, Software, Data Compression methods, High-Throughput Nucleotide Sequencing methods
- Abstract
Background: Over the past few decades, the emergence and maturation of new technologies have substantially reduced the cost of genome sequencing. As a result, the amount of genomic data that needs to be stored and transmitted has grown exponentially. For the standard sequencing data format, FASTQ, compression of the quality scores is a key and difficult aspect of FASTQ file compression. Throughout the literature, we found that the majority of current quality score compression methods do not support random access. It is therefore worthwhile to investigate a lossless quality score compressor with a high compression rate, fast compression and decompression speeds, and support for random access., Results: In this paper, we propose CMIC, an adaptive compressor for lossless compression of quality score sequences with random access support. CMIC is an acronym of its four sequential steps: classification, mapping, indexing, and compression. The experimental results show that our compressor performs well in terms of compression rate on all the tested datasets, reducing file sizes by up to 21.91% compared with LCQS. In terms of compression speed, CMIC is better than all other compressors in most of the tested cases. In terms of random access speed, CMIC is faster than LCQS, which also provides a random access function for compressed quality scores., Conclusions: CMIC is a compressor designed specifically for quality score sequences, with good performance in terms of compression rate, compression speed, decompression speed, and random access speed. CMIC is available at https://github.com/Humonex/Cmic ., (© 2022. The Author(s).)
- Published
- 2022
- Full Text
- View/download PDF
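CMIC layers classification, mapping, and indexing on top of compression, but the reason quality strings compress well at all is their long runs of repeated symbols. A minimal lossless round-trip using run-length encoding illustrates this (purely an illustration of lossless quality-string compression, not the CMIC algorithm):

```python
def rle_encode(qual):
    """Run-length encode a FASTQ quality string into (symbol, count) pairs."""
    runs = []
    i = 0
    while i < len(qual):
        j = i
        # Extend the run while the next character repeats the current one.
        while j + 1 < len(qual) and qual[j + 1] == qual[i]:
            j += 1
        runs.append((qual[i], j - i + 1))
        i = j + 1
    return runs

def rle_decode(runs):
    """Invert rle_encode, recovering the original string exactly (lossless)."""
    return "".join(ch * n for ch, n in runs)
```

Lossless compressors like CMIC must satisfy exactly this round-trip property: decode(encode(q)) == q for every input.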
8. Reliability of Gradient-Echo Magnetic Resonance Elastography of Lumbar Muscles: Phantom and Clinical Studies.
- Author
- Hsieh TJ, Chou MC, Chen YC, Chou YC, Lin CH, and Chen CK
- Abstract
Magnetic resonance elastography (MRE) has been used to successfully characterize the mechanical behavior of healthy and diseased muscles, but no study has investigated the reliability of MRE on lumbar muscles. The objective of this work was to determine the reliability of MRE techniques on lumbar muscles in both an ex vivo phantom study and an in vivo human study. Fresh porcine leg muscles were used in the phantom study, and 80 healthy adults (38.6 ± 11.2 years, 40 women) were recruited for the human study. Five repeated stiffness maps were obtained from both the phantom and human muscles using a gradient-echo MRE sequence with pneumatic vibration on a 1.5 T MR scanner. The technical failure rate, coefficient of variation (CV), and quality score were assessed to evaluate the reliability of MRE. Analysis of variance was performed to compare stiffness between different lumbar muscles, with differences considered significant if p < 0.05 after Bonferroni correction. The results showed that MRE achieved a zero technical failure rate and a low CV of stiffness (6.24 ± 1.41%) in the phantom muscles. In the human study, however, MRE exhibited high CVs of stiffness (21.57%-25.24%) in the lumbar muscles, and the technical failure rate was higher in the psoas muscles (60.0%-66.3%) than in the paraspinal muscles (0.0%-2.5%). Furthermore, higher quality scores were observed in the paraspinal muscles (7.31-7.71) than in the psoas muscles (1.83-2.06). In conclusion, MRE was a reliable technique for investigating the mechanical properties of lumbar muscles, but it was less reliable for assessing stiffness in the psoas muscles than in the paraspinal muscles.
- Published
- 2022
- Full Text
- View/download PDF
9. Modeling and optimizing an agro-supply chain considering different quality grades and storage systems for fresh products: a Benders decomposition solution approach.
- Author
- Keshavarz-Ghorbani F and Pasandideh SHR
- Abstract
This paper proposes a mathematical model in the context of agro-supply chain management, considering specific characteristics of agro-products to assist purchase, storage, and transportation decisions. In addition, a new method for determining the required quality score of different types of products is proposed based on their loss factors and purchasing costs. The model aims to minimize the total cost of purchasing fresh products, opening warehouses, holding inventories, operational activities, and transportation. Two sets of examples, including small and medium-sized problems, are implemented in the General Algebraic Modeling System (GAMS) to evaluate the model. The Benders decomposition (BD) algorithm is then applied to tackle the complexity of solving large-sized instances. The results of both GAMS and BD are compared in terms of objective function values and computational time to demonstrate the efficiency of the BD algorithm. Finally, the model is applied in a real case study involving an apple supply chain to obtain managerial insights., (© The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2021.)
- Published
- 2022
- Full Text
- View/download PDF
10. FCLQC: fast and concurrent lossless quality scores compressor.
- Author
- Cho M and No A
- Abstract
Background: Advances in sequencing technology have drastically reduced sequencing costs and, as a result, the amount of sequencing data is growing explosively. Since FASTQ files (the standard sequencing data format) are huge, there is a need for efficient compression of FASTQ files, especially their quality scores. Several quality score compression algorithms have recently been proposed, mainly focused on lossy compression to further boost the compression rate. However, for clinical applications and archiving purposes, lossy compression cannot replace lossless compression. One of the main challenges for lossless compression is time complexity: compressing a 1 GB file can take thousands of seconds. Other features, such as random access, are also desirable. Therefore, there is a need for a fast lossless compressor with a reasonable compression rate and random access functionality., Results: This paper proposes a Fast and Concurrent Lossless Quality scores Compressor (FCLQC) that supports random access and achieves a lower running time based on concurrent programming. Experimental results reveal that FCLQC is significantly faster than the baseline compressors at compression and decompression, at the expense of compression ratio. Compared to LCQS (the baseline quality score compression algorithm), FCLQC shows at least a 31x improvement in compression speed in all settings, with a degradation in compression ratio of at most 13.58% (8.26% on average). Compared to general-purpose compressors (such as 7-zip), FCLQC shows 3x faster compression speed with better compression ratios, by at least 2.08% (4.69% on average). Moreover, its random access decompression speed also outperforms the others. 
The concurrency of FCLQC is implemented in Rust; the performance gain increases near-linearly with the number of threads., Conclusion: Its superior compression and decompression speed makes FCLQC a practical lossless quality score compressor candidate for speed-sensitive applications of DNA sequencing data. FCLQC is available at https://github.com/Minhyeok01/FCLQC and is free for non-commercial usage., (© 2021. The Author(s).)
- Published
- 2021
- Full Text
- View/download PDF
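Random access of the kind FCLQC advertises is commonly obtained by compressing fixed-size blocks of records independently and recording their offsets, so a single block can be decompressed without scanning the whole stream. A minimal stdlib sketch of that idea (the block size, helper names, and layout here are illustrative assumptions; FCLQC itself is a concurrent Rust implementation):

```python
import zlib

def compress_blocks(lines, block_size=2):
    """Compress records in independent blocks; return (blob, index).

    index[i] holds the (offset, length) of block i inside blob, so any
    block can be decompressed on its own.
    """
    blob, index = b"", []
    for i in range(0, len(lines), block_size):
        chunk = "\n".join(lines[i:i + block_size]).encode()
        comp = zlib.compress(chunk)
        index.append((len(blob), len(comp)))
        blob += comp
    return blob, index

def read_block(blob, index, i):
    """Decompress only block i, using the stored offset and length."""
    off, n = index[i]
    return zlib.decompress(blob[off:off + n]).decode().split("\n")
```

The trade-off is the one the FASTQ-compression abstracts in this listing describe: smaller blocks mean faster random access but a worse compression ratio, since each block is compressed without context from its neighbors.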
11. Systematic review and quality evaluation of published human ingestion-time trials of blood pressure-lowering medications and their combinations.
- Author
- Hermida RC, Hermida-Ayala RG, Mojón A, Smolensky MH, and Fernández JR
- Subjects
- Antihypertensive Agents therapeutic use, Blood Pressure, Circadian Rhythm, Eating, Humans, Prospective Studies, Blood Pressure Monitoring, Ambulatory, Hypertension drug therapy
- Abstract
The pharmacokinetics (PK) - absorption, distribution, metabolism, and elimination - and pharmacodynamics (PD) of hypertension medications can be significantly affected by circadian rhythms. As a consequence, the time when blood pressure (BP)-lowering medications are ingested, relative to the staging of the circadian rhythms modulating PK and PD, can affect their duration of action, the magnitude of their effect on features of the 24 h BP profile, and their safety. We conducted a systematic and comprehensive review of published prospective human trials that investigated individual hypertension medications of all classes, and their combinations, for ingestion-time differences in BP-lowering, safety, patient adherence, and markers of hypertension-associated target organ pathology of the kidney and heart. The systematic review yielded 155 trials published between 1976 and 2020, totaling 23,972 hypertensive individuals, that evaluated 37 different single and 14 dual-combination therapies. The vast majority of them (83.9%) reported clinically and statistically significant benefits, including enhanced reduction of the asleep BP mean without induced sleep-time hypotension, reduced prevalence of the higher-cardiovascular-risk non-dipper 24 h BP profile, decreased incidence of adverse effects, improved kidney function, and reduced cardiac pathology, when hypertension medications were ingested at bedtime/evening rather than upon waking/in the morning. Nonetheless, the findings and conclusions of some past trials are inconsistent, often due to disparities and deficiencies of the investigative protocols. 
Accordingly, we developed a quality assessment method based upon the eight items identified as crucial in the recently published guidelines of the International Society for Chronobiology and the American Association for Medical Chronobiology and Chronotherapeutics for the design and conduct of human clinical trials on ingestion-time differences of hypertension medications. Among the most frequent deficiencies are: absence or miscalculation of the minimum required sample size (83.2%), incorrect choice of the primary BP endpoint (53.6%), and arbitrary, unrepresentative clock hours chosen for the tested treatment times (53.6%). The inability of the very small proportion (16.1%) of trials to verify the advantages of the at-bedtime/evening treatment strategy is likely explained by deficiencies of their study design and conduct. Nonetheless, regardless of the quality scores of the 155 trials retrieved by our systematic review, it is most noteworthy that no single published prospective randomized trial reported significantly enhanced BP-lowering, safety, compliance, or other benefits of the upon-waking/morning treatment-time scheme, which remains the most commonly recommended despite being unjustified by medical evidence.
- Published
- 2021
- Full Text
- View/download PDF
12. ngsComposer: an automated pipeline for empirically based NGS data quality filtering.
- Author
- Kuster RD, Yencho GC, and Olukolu BA
- Subjects
- Computer Simulation, Humans, Reproducibility of Results, Algorithms, Computational Biology methods, High-Throughput Nucleotide Sequencing methods, Sequence Analysis, DNA methods, Software
- Abstract
Next-generation sequencing (NGS) enables massively parallel acquisition of large-scale omics data; however, objective data quality filtering parameters are lacking. Although a useful metric, evidence reveals that platform-generated Phred values overestimate per-base quality scores. We have developed novel, empirically based algorithms that streamline NGS data quality filtering. The pipeline leverages known sequence motifs to enable empirical estimation of error rates, detection of erroneous base calls, and removal of contaminating adapter sequences. The performance of motif-based error detection and quality filtering was further validated with read compression rates as an unbiased metric. Elevated error rates at read ends, where the known motifs lie, tracked with propagation of erroneous base calls. Barcode swapping, an inherent problem with pooled libraries, was also effectively mitigated. The ngsComposer pipeline is suitable for various NGS protocols and platforms due to the universal concepts on which its algorithms are based., (© The Author(s) 2021. Published by Oxford University Press.)
- Published
- 2021
- Full Text
- View/download PDF
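The Phred values that ngsComposer re-estimates empirically are defined by Q = -10·log10(P), where P is the per-base error probability; FASTQ files store Q as ASCII characters, usually at an offset of 33 (the Sanger encoding). A small sketch of the standard conversion (illustrative, not part of the ngsComposer pipeline):

```python
import math

def phred_to_error_prob(qual, offset=33):
    """Decode a Phred+33 quality string into per-base error probabilities."""
    return [10 ** (-(ord(c) - offset) / 10) for c in qual]

def error_prob_to_phred(p):
    """Invert the mapping: error probability back to an integer Phred score."""
    return round(-10 * math.log10(p))
```

Under this encoding 'I' (Q = 40) means an error probability of 1 in 10,000, while '!' (Q = 0) means the base call carries no information, which is why a platform that systematically overstates Q understates the true error rate.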
13. Guidelines for Sanger sequencing and molecular assay monitoring.
- Author
- Crossley BM, Bai J, Glaser A, Maes R, Porter E, Killian ML, Clement T, and Toohey-Kurth K
- Subjects
- Animals, Base Sequence, High-Throughput Nucleotide Sequencing methods, Humans, Laboratories, Phylogeny, Sequence Analysis, DNA veterinary, Animal Diseases diagnosis, High-Throughput Nucleotide Sequencing veterinary, Polymerase Chain Reaction veterinary
- Abstract
Genetic sequencing, or DNA sequencing, using the Sanger technique has become widely used in the veterinary diagnostic community. This technology plays a role in verification of PCR results and is used to provide the genetic sequence data needed for phylogenetic analysis, epidemiologic studies, and forensic investigations. The Laboratory Technology Committee of the American Association of Veterinary Laboratory Diagnosticians has prepared guidelines for sample preparation, submission to sequencing facilities or instrumentation, quality assessment of the nucleic acid sequence data produced, and generation of basic sequencing data and phylogenetic analyses for diagnostic applications. This guidance is aimed at assisting laboratories in providing consistent, high-quality, and reliable sequence data when using Sanger-based genetic sequencing as a component of their laboratory services.
- Published
- 2020
- Full Text
- View/download PDF
14. The quality of evidence for medical interventions does not improve or worsen: a metaepidemiological study of Cochrane reviews.
- Author
- Howick J, Koletsi D, Pandis N, Fleming PS, Loef M, Walach H, Schmidt S, and Ioannidis JPA
- Subjects
- Epidemiologic Studies, Humans, Outcome Assessment, Health Care methods, Outcome Assessment, Health Care statistics & numerical data, Systematic Reviews as Topic, Data Management methods, Outcome Assessment, Health Care standards, Quality Indicators, Health Care statistics & numerical data
- Abstract
Objectives: The objective of the study was to determine the change in quality of evidence in updates of Cochrane reviews initially published between January 1, 2013 and June 30, 2014. We used the Grading of Recommendations, Assessment, Development and Evaluation (GRADE) system to document evidence quality., Study Design and Setting: We searched the Cochrane Database of Systematic Reviews on March 20, 2020 to identify which of the reviews from the initial (2013/14) sample had been updated. Using the same methods as the previous analysis, we assessed the quality of evidence for the first-listed primary outcomes in the updated reviews., Results: Of the 608 reviews in the original sample, 154 had been updated, and 151 of these (24.8%) contained available data for both the original and updated systematic reviews. The updated reviews included: 15 (9.9%) with high-quality evidence, 56 (37.1%) with moderate-quality evidence, 47 (31.1%) with low-quality evidence, and 33 (21.9%) with very low-quality evidence. No change in the GRADE quality of evidence was found for most (103, 68.2%) of the updated reviews. Of the 48 reviews whose rating changed, the quality of evidence was downgraded in 28 (58.3%) and upgraded in 20 (41.7%), although only six reviews were promoted to high quality., Conclusion: Updated systematic reviews continued to suggest that only a minority of outcomes for health care interventions are supported by high-quality evidence. The quality of the evidence did not consistently improve or worsen in updated reviews., (Copyright © 2020 Elsevier Inc. All rights reserved.)
- Published
- 2020
- Full Text
- View/download PDF
15. LCQS: an efficient lossless compression tool of quality scores with random access functionality.
- Author
- Fu J, Ke B, and Dong S
- Subjects
- Algorithms, Genomics, Sequence Analysis, DNA, Software, Data Compression methods
- Abstract
Background: Advanced sequencing machines have dramatically sped up the generation of genomic data, making efficient compression of sequencing data urgent and significant. As the most difficult part of the standard sequencing data format FASTQ, compression of the quality scores has become a conundrum in the development of FASTQ compression. Existing lossless compressors of quality scores mainly rely on patterns specific to particular sequencers and on complex context modeling techniques to address the problem of low compression ratio. However, these compressors suffer from weak robustness, meaning unstable or even unavailable results on some sequencing files, and from slow compression speed. Meanwhile, some compressors attempt to construct a fine-grained index structure to address slow random access decompression, but they do so at the sacrifice of compression speed and at the expense of large index files, which makes them inefficient and impractical. Therefore, an efficient lossless compressor of quality scores with strong robustness, a high compression ratio, and fast compression and random access decompression speeds is urgently needed., Results: In this paper, based on the idea of maximizing the use of hardware resources, we propose LCQS, a lossless compression tool specialized for quality scores. It consists of four sequential processing steps: partitioning, indexing, packing, and parallelizing. Experimental results reveal that LCQS outperforms all the other state-of-the-art compressors on all criteria except compression speed on the dataset SRR1284073. Furthermore, LCQS is robust on all the test datasets, with acceleration ratios of compression speed of up to 29.1x, file size reductions of up to 28.78%, and random access decompression speedups of up to 2.1x. 
Additionally, LCQS exhibits strong scalability: its compression speed increases almost linearly with the size of the input dataset., Conclusion: The ability to handle all kinds of quality scores and its superiority in compression ratio and compression speed make LCQS an efficient and advanced lossless quality score compressor, along with its fast random access decompression. LCQS can be downloaded from https://github.com/SCUT-CCNL/LCQS and is free for non-commercial usage.
- Published
- 2020
- Full Text
- View/download PDF
16. Comparison of bias adjustment methods in meta-analysis suggests that quality effects modeling may have fewer limitations than other approaches.
- Author
- Stone JC, Glass K, Munn Z, Tugwell P, and Doi SAR
- Subjects
- Bias, Humans, Models, Theoretical, Meta-Analysis as Topic
- Abstract
Background: The quality of primary research is commonly assessed before inclusion in meta-analyses. Findings are discussed in the context of the quality appraisal by categorizing studies according to risk of bias. The impact of the appraised risk of bias on study outcomes is typically judged by the reader; however, several methods have been developed to quantify the risk of bias assessment and incorporate it into the pooled results of a meta-analysis, a process known as bias adjustment. The advantages, potential limitations, and applicability of these methods are not well defined., Study Design and Setting: The applicability and limitations of the various methods are compared using two examples from the literature. These methods include weighting, stratification, regression, use of empirically based prior distributions, and elicitation from experts., Results: The two examples from the literature suggest that all methods provide similar adjustment. The methods differed mainly in applicability and limitations., Conclusion: Bias adjustment is a feasible process in meta-analysis, with several strategies currently available. Quality effects modeling was found to be easily implementable, with fewer limitations than the other methods., (Copyright © 2019 Elsevier Inc. All rights reserved.)
- Published
- 2020
- Full Text
- View/download PDF
17. Indexing k-mers in linear space for quality value compression.
- Author
- Shibuya Y and Comin M
- Subjects
- Algorithms, Databases, Genetic, Genome, Human, Genotyping Techniques, High-Throughput Nucleotide Sequencing, Humans, Polymorphism, Single Nucleotide, Data Compression methods, Genomics methods, Software
- Abstract
Many bioinformatics tools rely heavily on k-mer dictionaries to describe the composition of sequences and to allow for faster reference-free algorithms or look-ups. Unfortunately, naive k-mer dictionaries are very memory-inefficient, requiring a very large amount of storage space to save each k-mer. This problem is generally worsened by the need for an index to support fast queries. In this work, we discuss how to build an indexed linear reference containing a set of input k-mers, and its application to the compression of quality scores in FASTQ files. Most of the entropy of sequencing data lies in the quality scores, which makes them difficult to compress. Here, we present an application that improves the compressibility of quality values while preserving the information needed for SNP calling. We show how a dictionary of significant k-mers, obtained from SNP databases and multiple genomes, can be indexed in linear space and used to improve the compression of quality values. Availability: The software is freely available at https://github.com/yhhshb/yalff.
- Published
- 2019
- Full Text
- View/download PDF
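The core idea of dictionary-guided quality compression can be sketched in a few lines. This toy version (not the YALFF implementation; the k-mer set, read, and quality string are made up) replaces the quality score of every base covered by at least one trusted k-mer with a constant, which makes the quality string far more compressible:

```python
# Illustrative sketch (not the YALFF code): smooth the quality string of
# a read by replacing scores with a constant wherever the base is covered
# by at least one k-mer found in a trusted dictionary.
K = 4
SMOOTHED = "I"  # Phred 40 in Sanger (Phred+33) encoding

def smooth_qualities(seq, qual, trusted, k=K):
    covered = [False] * len(seq)
    for i in range(len(seq) - k + 1):
        if seq[i:i + k] in trusted:          # k-mer is in the dictionary
            for j in range(i, i + k):
                covered[j] = True            # all its bases count as trusted
    return "".join(SMOOTHED if covered[i] else qual[i] for i in range(len(seq)))

trusted = {"ACGT", "CGTA"}                   # toy trusted k-mer set
result = smooth_qualities("ACGTAGG", "#!:FFBB", trusted)
```

Bases not covered by any trusted k-mer keep their original scores, preserving the information needed for variant calling at suspect positions.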
18. Stratification by quality induced selection bias in a meta-analysis of clinical trials.
- Author
-
Stone J, Gurunathan U, Glass K, Munn Z, Tugwell P, and Doi SAR
- Subjects
- Humans, Meta-Analysis as Topic, Selection Bias, Research Design standards
- Abstract
Objectives: The inconsistency demonstrated across strata when using different scales has been attributed to quality scores, and stratification continues to be done using risk of bias domain judgments. This study examines whether restricting primary meta-analyses to studies at low risk of bias or presenting meta-analyses stratified according to risk of bias is indeed the right approach to explore potential methodological bias., Study Design and Setting: Reanalysis of the impact of quality subgroupings in an existing meta-analysis based on 25 different scales., Results: We demonstrate that quality stratification itself is the problem because it induces a spurious association between effect size and precision within each stratum. Studies with larger effects or lower precision tend to be of lower quality, a form of collider-stratification bias (the stratum being the common effect of the reasons for these two outcomes) that leads to inconsistent results across scales. We also show that the extent of this association determines the variability in effect size and statistical significance across strata when conditioning on quality., Conclusions: We conclude that stratification by quality leads to a form of selection bias (collider-stratification bias) and should be avoided. We demonstrate consistent results with an alternative method that includes all studies., (Copyright © 2018 Elsevier Inc. All rights reserved.)
- Published
- 2019
- Full Text
- View/download PDF
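The collider-stratification mechanism described above is easy to reproduce in a toy simulation (an illustration, not the authors' reanalysis). Effect size and precision are drawn independently, but "low quality" is assigned to studies with larger effects or lower precision; conditioning on that stratum then induces a spurious effect-precision association:

```python
# Toy simulation of collider-stratification bias: effect and precision
# are independent in the full population, but become associated within
# the "low quality" stratum, because quality is a common effect of both.
import random

random.seed(0)
n = 20000
effects = [random.gauss(0, 1) for _ in range(n)]
precisions = [random.gauss(0, 1) for _ in range(n)]
# A study is "low quality" if its effect is large OR its precision is low.
low_quality = [e > 0 or p < 0 for e, p in zip(effects, precisions)]

def corr(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    vx = sum((x - mx) ** 2 for x in xs) / len(xs)
    vy = sum((y - my) ** 2 for y in ys) / len(ys)
    return cov / (vx * vy) ** 0.5

full = corr(effects, precisions)             # ~0: truly independent
stratum = corr([e for e, s in zip(effects, low_quality) if s],
               [p for p, s in zip(precisions, low_quality) if s])
```

The within-stratum correlation is clearly nonzero even though the variables were generated independently, which is exactly the selection bias the abstract warns against.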
19. A model-guided method for improving coronary artery tree extractions from CCTA images.
- Author
-
Cao Q, Broersen A, Kitslaar PH, Lelieveldt BPF, and Dijkstra J
- Subjects
- Algorithms, Coronary Vessels anatomy & histology, Humans, Models, Anatomic, Coronary Angiography methods, Coronary Artery Disease diagnostic imaging, Coronary Vessels diagnostic imaging, Image Processing, Computer-Assisted, Tomography, X-Ray Computed methods
- Abstract
Purpose: Automatically extracted coronary artery trees (CATs) from coronary computed tomography angiography images could contain incorrect extractions which require manual corrections before they can be used in clinical practice. A model-guided method for improving the extracted CAT is described to automatically detect potential incorrect extractions and improve them., Methods: The proposed method is a coarse-to-fine approach. A coarse improvement is first applied on all vessels in the extracted CAT, and then a fine improvement is applied only on vessels with higher clinical significance. Based upon a decision tree, the proposed method automatically and iteratively performs improvement operations for the entire extracted CAT until it meets the stop criteria. The improvement in the extraction quality obtained by the proposed method is measured using a scoring system. Eighteen datasets were used to determine optimal values for the parameters involved in the model-guided method and 122 datasets were used for evaluation., Results: Compared to the initial automatic extractions, the proposed method improves the CATs for 122 datasets from an average quality score of 87 ± 6 to 93 ± 4. The developed method is able to run within 2 min on a typical workstation. The difference in extraction quality after automatic improvement is negatively correlated with the initial extraction quality (R = -0.694, P < 0.001)., Conclusion: Without deteriorating the initially extracted CATs, the presented method automatically detects incorrect extractions and improves the CATs to an average quality score of 93 guided by anatomical statistical models.
- Published
- 2019
- Full Text
- View/download PDF
20. Estimating Phred scores of Illumina base calls by logistic regression and sparse modeling.
- Author
-
Zhang S, Wang B, Wan L, and Li LM
- Subjects
- Logistic Models, High-Throughput Nucleotide Sequencing methods, Sequence Analysis, DNA methods, Software
- Abstract
Background: Phred quality scores are essential for downstream DNA analysis such as SNP detection and DNA assembly. Thus a valid model to define them is indispensable for any base-calling software. Recently, we developed the base-caller 3Dec for Illumina sequencing platforms, which reduces base-calling errors by 44-69% compared to the existing ones. However, the model to predict its quality scores has not been fully investigated yet., Results: In this study, we used logistic regression models to evaluate quality scores from predictive features, which include different aspects of the sequencing signals as well as local DNA contents. Sparse models were further obtained by three methods: backward deletion with either AIC or BIC, and the L1 regularization learning method. The L1-regularized model was then compared with the Illumina scoring method., Conclusions: The L1-regularized logistic regression improves the empirical discrimination power by as much as 14% and 25%, respectively, for two kinds of preprocessed sequencing signals, compared to the Illumina scoring method. Namely, the L1 method identifies more base calls of high fidelity. Computationally, the L1 method can handle large datasets and is efficient enough for daily sequencing. Meanwhile, the logistic model resulting from BIC is more interpretable. The modeling suggested that the most prominent quenching pattern in the current chemistry of Illumina occurred at the dinucleotide "GT". Besides, nucleotides were more likely to be miscalled as the previous base if the preceding base was not "G". This suggested that the phasing effect of bases after "G" was somewhat different from that after other nucleotide types.
- Published
- 2017
- Full Text
- View/download PDF
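The transform underlying the abstract above is the Phred scale, Q = -10 · log10(p_error). A minimal sketch of how a logistic model's predicted error probability becomes a Phred score follows; the feature values and coefficients are made up for illustration, not taken from the paper:

```python
# Phred scale: Q = -10 * log10(P(error)). A logistic regression that
# predicts the probability a base call is wrong can be converted to a
# Phred quality score with this transform. Coefficients are hypothetical.
import math

def phred(p_error):
    """Convert an error probability to a Phred quality score."""
    return -10.0 * math.log10(p_error)

def error_prob(features, coefs, intercept):
    """Logistic model: P(error) = 1 / (1 + exp(-(b0 + b.x)))."""
    z = intercept + sum(c * x for c, x in zip(coefs, features))
    return 1.0 / (1.0 + math.exp(-z))

p = error_prob([0.2, 1.5], [-1.0, -2.0], -3.0)  # hypothetical signal features
q = phred(p)
```

For example, an error probability of 0.001 maps to Q30, the common "high quality" benchmark.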
21. Impact of the angle used in 2D Ultrasonography on the foetal femur diaphysis measurement.
- Author
-
Lincé A, Capelle X, Lepage S, Kridelka F, and Van Linthout C
- Abstract
Objective: The purpose of this pilot study is to compare the 2D scanning measurement of the foetal femoral diaphysis using an anterior or lateral/external incidence at ultrasound., Methods: In August 2016, 30 consecutive patients underwent a second trimester morphology ultrasound between 21 and 24 weeks of gestation performed by a senior sonographer. In each case, the femur length was measured either with an anterior angle, estimating the straight aspect of the diaphysis, or with a lateral angle, assessing its curved aspect. The two measures were collected prospectively. The difference between paired measurements was calculated and expressed as a percentage (mm) and as a percentile., Results: The median difference between the two ultrasound angles in terms of femur length was 3.55%, and in terms of percentile variation it was 17.16., Conclusion: An anterior angle of measurement of the femur length seems to allow an optimal measure of the straight and longest aspect of the diaphysis. According to our results, this angle should be considered when scoring the quality of a morphological ultrasound, but further and larger studies should be done to confirm our hypothesis., Competing Interests: Declaration: The authors report no financial or commercial conflicts of interest.
- Published
- 2017
22. [Quality management in a clinical research facility: Evaluation of changes in quality in-house figures and the appraisal of in-house quality indicators].
- Author
-
Aden B, Allekotte S, and Mösges R
- Subjects
- Germany, Humans, Prospective Studies, Retrospective Studies, Quality Assurance, Health Care, Quality Indicators, Health Care
- Abstract
For the long-term maintenance and improvement of quality within a clinical research institute, the implementation and certification of a quality management system is a suitable approach. A quality management system implemented according to the still-valid DIN EN ISO 9001:2008 standard allows the desired quality objectives to be achieved effectively. The evaluation of quality scores and the appraisal of in-house quality indicators make an important contribution in this regard. To achieve this and draw quality assurance conclusions, quality indicators that are as sensible and sensitive as possible were developed, based on the institute's own key objectives, the retrospective evaluation of quality scores, a prospective follow-up, and discussions. In the in-house clinical research institute, the measures introduced by the quality management system led to higher efficiency in work processes, improved staff skills, higher customer satisfaction and, overall, to more successful outcomes in relation to the self-defined key objectives., (Copyright © 2016. Published by Elsevier GmbH.)
- Published
- 2016
- Full Text
- View/download PDF
23. Effect of carcass fat and conformation class on consumer perception of various grilled beef muscles.
- Author
-
Guzek D, Głąbska D, Gutkowska K, and Wierzbicka A
- Abstract
The aim of the study was to analyse the attributes influencing consumer perception of grilled beef steaks. The objects were 30 carcasses from which eight cuts were obtained (2100 single samples were prepared). A total of 350 consumers were asked to rate the meat samples (6 samples per consumer) by assessing tenderness, juiciness, flavour, overall acceptability and satisfaction. The MQ4, a combination of the consumer ratings for tenderness, juiciness, flavour and overall acceptability transformed into a single parameter with greater discriminatory ability, was calculated using linear discriminant analysis. The tenderloin was the cut with the highest ratings for all attributes; however, tenderness, juiciness, MQ4 and consumer satisfaction evaluated for oyster blade were not significantly different from tenderloin. The results of this study indicated that consumer preferences regarding grilled steak were not influenced by fat class, conformation class, rib fat thickness or ossification score of the carcasses but only by the type of meat cut., Competing Interests: The authors declare that there are no conflicts of interest. Ethics standard: All cattle were slaughtered in the main commercial slaughterhouse in Poland according to European law. Informed consent: All participants provided their written informed consent to participate in this study.
- Published
- 2016
- Full Text
- View/download PDF
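The MQ4 composite mentioned above can be sketched as a weighted sum of the four consumer ratings. The weights below are hypothetical placeholders, not the coefficients fitted by the study's linear discriminant analysis:

```python
# Sketch of an MQ4-style composite palatability score. The weights here
# are hypothetical, not those fitted in the study: four consumer ratings
# (each on a 0-100 scale) are collapsed into one 0-100 score.
WEIGHTS = {"tenderness": 0.3, "juiciness": 0.1, "flavour": 0.3, "overall": 0.3}

def mq4(ratings):
    """Weighted combination of the four consumer attributes."""
    return sum(WEIGHTS[k] * ratings[k] for k in WEIGHTS)

score = mq4({"tenderness": 80, "juiciness": 70, "flavour": 75, "overall": 78})
```

In practice such weights are estimated from consumer data so that the single score discriminates quality grades better than any individual attribute.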
24. Analysis of the factors creating consumer attributes of roasted beef steaks.
- Author
-
Guzek D, Głąbska D, Gutkowska K, Wierzbicki J, Woźniak A, and Wierzbicka A
- Subjects
- Animals, Cattle, Fats analysis, Humans, Consumer Behavior statistics & numerical data, Cooking methods, Food Quality, Meat analysis, Meat classification
- Abstract
The aim of the study was to analyze the factors creating consumer attributes of roasted beef steaks of various animals. Eight cuts from 30 carcasses (characterized by various types of animal, conformation and fat class, rib fat thickness, and ossification score) were selected. Samples were prepared using the roasting method, and consumers rated the tenderness, juiciness, flavor and overall acceptability (each on a 100-point scale) and satisfaction (rated from 2 to 5) of the analyzed samples. No influence of type of animal, fat class, conformation class or ossification score on the results of consumer analysis was observed. For all analyzed factors, an influence of cut on consumer analysis was observed: the highest values of all consumer attributes were observed for tenderloin (for juiciness significantly higher than for other cuts; for tenderness, flavor and MQ4 comparable only with rump (RMP231); and for overall acceptability and satisfaction comparable with both rump cuts). The relationship between rib fat thickness and consumer attributes of roasted beef was not linear, but an influence was observed: the highest values of consumer attributes were found at 13 mm rib fat thickness., (© 2014 Japanese Society of Animal Science.)
- Published
- 2015
- Full Text
- View/download PDF
25. DRISEE overestimates errors in metagenomic sequencing data.
- Author
-
Eren AM, Morrison HG, Huse SM, and Sogin ML
- Subjects
- Base Sequence, DNA genetics, Polymerase Chain Reaction, Metagenomics, Sequence Analysis, DNA
- Abstract
The extremely high error rates reported by Keegan et al. in 'A platform-independent method for detecting errors in metagenomic sequencing data: DRISEE' (PLoS Comput Biol 2012; 8:e1002541) for many next-generation sequencing datasets prompted us to re-examine their results. Our analysis reveals that the presence of conserved artificial sequences, e.g. Illumina adapters, and other naturally occurring sequence motifs accounts for most of the reported errors. We conclude that DRISEE reports inflated levels of sequencing error, particularly for Illumina data. Tools offered for evaluating large datasets need scrupulous review before they are implemented., (© The Author 2013. Published by Oxford University Press.)
- Published
- 2014
- Full Text
- View/download PDF
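The confound described above (adapter fragments masquerading as sequencing errors) suggests screening reads for known adapter motifs before estimating error rates. The sketch below is an illustration, not the DRISEE code; the adapter string is the common Illumina TruSeq adapter prefix, and the reads are toy data:

```python
# Illustration of the confound above (not the DRISEE code): reads that
# carry adapter fragments look like "errors" to a duplicate-based error
# estimator, so they should be screened out before estimation.
ADAPTERS = ["AGATCGGAAGAGC"]  # common Illumina TruSeq adapter prefix

def screen_reads(reads, adapters=ADAPTERS):
    """Split reads into (clean, suspect) by adapter substring match."""
    clean, suspect = [], []
    for r in reads:
        (suspect if any(a in r for a in adapters) else clean).append(r)
    return clean, suspect

reads = ["ACGTACGTACGT", "TTAGATCGGAAGAGCAA", "GGGTTTCCC"]
clean, suspect = screen_reads(reads)
```

Only the adapter-free reads would then be passed to the error-rate estimator, removing much of the artificial inflation.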
26. Traversing the k-mer Landscape of NGS Read Datasets for Quality Score Sparsification.
- Author
-
Yu YW, Yorukoglu D, and Berger B
- Abstract
It is becoming increasingly impractical to indefinitely store raw sequencing data for later processing in an uncompressed state. In this paper, we describe a scalable compressive framework, Read-Quality-Sparsifier (RQS), which substantially outperforms other de novo quality score compression methods in compression ratio and speed while maintaining SNP-calling accuracy. Surprisingly, RQS also improves the SNP-calling accuracy on a gold-standard, real-life sequencing dataset (NA12878) using a k-mer density profile constructed from 77 other individuals from the 1000 Genomes Project. This improvement in downstream accuracy emerges from the observation that quality score values within NGS datasets are inherently encoded in the k-mer landscape of the genomic sequences. To our knowledge, RQS is the first scalable sequence-based quality compression method that can efficiently compress quality scores of terabyte-sized and larger sequencing datasets., Availability: An implementation of our method, RQS, is available for download at: http://rqs.csail.mit.edu/.
- Published
- 2014
- Full Text
- View/download PDF
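Why sparsification helps compression can be seen with a toy run-length experiment (illustrative only, not the RQS algorithm): replacing scores at k-mer-supported positions with a single constant creates long runs, which a downstream encoder then collapses. The quality strings below are made up:

```python
# Toy demonstration: a sparsified quality string has far fewer runs than
# a raw one, so run-length encoding (a stand-in for any downstream
# compressor) represents it much more compactly.
from itertools import groupby

def run_length_encode(s):
    """Return (character, run_length) pairs for a string."""
    return [(ch, len(list(g))) for ch, g in groupby(s)]

raw = "FF:F#FB:F"      # varied quality string (hypothetical)
sparse = "IIIII#III"   # same read after sparsification (hypothetical)

raw_runs = run_length_encode(raw)
sparse_runs = run_length_encode(sparse)
```

The sparsified string compresses to a fraction of the runs while keeping the original score only at the one suspect position.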
27. A glance at quality score: implication for de novo transcriptome reconstruction of Illumina reads.
- Author
-
Mbandi SK, Hesse U, Rees DJ, and Christoffels A
- Abstract
Downstream analyses of short reads from next-generation sequencing platforms are often preceded by a pre-processing step that removes uncalled and wrongly called bases. Standard approaches rely on the associated base quality scores, retaining the read or a portion of it when the score is above a predefined threshold. Without a reference, it is difficult to differentiate sequencing error from biological variation using quality scores. The effects of quality score based trimming have not been systematically studied in de novo transcriptome assembly. Using RNA-Seq data produced from Illumina, we teased out the effects of quality score based filtering or trimming on de novo transcriptome reconstruction. We showed that assemblies produced from reads subjected to different quality score thresholds contain truncated and missing transfrags when compared to those from untrimmed reads. Our data support the observation that de novo assembly of untrimmed data is challenging for de Bruijn graph assemblers. However, our results indicate that comparing the assemblies from untrimmed and trimmed read subsets can suggest appropriate filtering parameters and enable selection of the optimum de novo transcriptome assembly in non-model organisms.
- Published
- 2014
- Full Text
- View/download PDF
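The threshold-based trimming the abstract evaluates can be sketched minimally. This is one common scheme among several (drop low-quality bases from the 3' end), with toy read data; it is not code from the paper:

```python
# Minimal sketch of threshold-based quality trimming: remove bases from
# the 3' end of a read while their Phred score (Phred+33 encoding) is
# below a threshold Q.
def trim_3prime(seq, qual, q_threshold, offset=33):
    """Return (sequence, qualities) with the low-quality 3' tail removed."""
    end = len(seq)
    while end > 0 and ord(qual[end - 1]) - offset < q_threshold:
        end -= 1
    return seq[:end], qual[:end]

# '#' encodes Q2, 'I' encodes Q40 in Phred+33.
seq, qual = trim_3prime("ACGTACGT", "IIIIII##", 20)
```

The paper's point is that choosing `q_threshold` too aggressively truncates transfrags in the resulting assembly, so comparing assemblies across thresholds is what guides the choice.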
28. PRINTQUAL - a measure for assessing the quality of newspaper reporting of suicide.
- Author
-
John A, Hawton K, Lloyd K, Luce A, Platt S, Scourfield J, Marchant AL, Jones PA, and Dennis MS
- Subjects
- Humans, United Kingdom, Newspapers as Topic standards, Suicide psychology, Suicide statistics & numerical data
- Abstract
Background: Many studies have demonstrated a relationship between newspaper reporting of actual or fictional suicides and subsequent suicidal behaviors. Previous measures of the quality of reporting lack consistency concerning which specific elements should be included and how they should be weighted., Aims: To develop an instrument, PRINTQUAL, comprising two scales of the quality (poor and good) of newspaper reporting of suicide that can be used in future studies of reporting., Method: A first draft of the PRINTQUAL instrument was compiled, comprising items indicative of poor- and good-quality newspaper reporting based on guidelines and key sources of evidence. This was refined by team members and then circulated to a group of international experts in the field for further opinion and weighting of individual items., Results: The final instrument comprised 19 items in the poor-quality scale and four in the good-quality scale. Following training, agreement between raters was acceptably high for most items (κ ≥ .75) except for three items for which agreement was still acceptable (κ ≥ .60)., Conclusion: The PRINTQUAL instrument for assessing the quality of newspaper reporting of suicide appears appropriate for use in research and monitoring in future studies.
- Published
- 2014
- Full Text
- View/download PDF
29. Probabilistic model based error correction in a set of various mutant sequences analyzed by next-generation sequencing.
- Author
-
Aita T, Ichihashi N, and Yomo T
- Subjects
- Mutation, Algorithms, DNA genetics, Genetic Variation genetics, High-Throughput Nucleotide Sequencing, Models, Statistical, Sequence Analysis, DNA
- Abstract
To analyze the evolutionary dynamics of a mutant population in an evolutionary experiment, it is necessary to sequence a vast number of mutants by high-throughput (next-generation) sequencing technologies, which enable rapid and parallel analysis of multikilobase sequences. However, the observed sequences include many errors of base call. Therefore, if next-generation sequencing is applied to analysis of a heterogeneous population of various mutant sequences, it is necessary to discriminate between true bases as point mutations and errors of base call in the observed sequences, and to subject the sequences to error-correction processes. To address this issue, we have developed a novel method of error correction based on the Potts model and a maximum a posteriori probability (MAP) estimate of its parameters corresponding to the "true sequences". Our method of error correction utilizes (1) the "quality scores" which are assigned to individual bases in the observed sequences and (2) the neighborhood relationship among the observed sequences mapped in sequence space. The computer experiments of error correction of artificially generated sequences supported the effectiveness of our method, showing that 50-90% of errors were removed. Interestingly, this method is analogous to a probabilistic model based method of image restoration developed in the field of information engineering., (Copyright © 2013 Elsevier Ltd. All rights reserved.)
- Published
- 2013
- Full Text
- View/download PDF
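A greatly simplified stand-in for the correction scheme above can show the two ingredients it uses, quality scores and agreement among neighboring reads. The real method solves a MAP estimation problem on a Potts model; this toy version only does a quality-weighted consensus vote at one position, with made-up calls:

```python
# Simplified stand-in for the Potts-model MAP correction (not the
# authors' algorithm): at one sequence position, low-quality base calls
# are replaced by the quality-weighted consensus of all reads covering
# that position.
from collections import defaultdict

def correct_position(calls, threshold=20):
    """calls: list of (base, phred_quality) pairs covering one position."""
    votes = defaultdict(float)
    for base, q in calls:
        votes[base] += q                     # weight each vote by quality
    consensus = max(votes, key=votes.get)
    # Keep confident calls; replace doubtful ones with the consensus.
    return [base if q >= threshold else consensus for base, q in calls]

corrected = correct_position([("A", 38), ("A", 40), ("C", 5), ("A", 35)])
```

Here the lone low-quality "C" is overruled by the high-quality "A" neighborhood, mimicking how the probabilistic method discriminates point mutations from base-call errors.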
30. The pterional and suprabrow approaches for aneurysm surgery: a systematic review of intraoperative rupture rates in 9488 aneurysms.
- Author
-
Madhugiri VS, Ambekar S, Pandey P, Guthikonda B, Bollam P, Brown B, Ahmed O, Sonig A, Sharma M, and Nanda A
- Subjects
- Aneurysm, Ruptured therapy, Craniotomy methods, Data Interpretation, Statistical, Endovascular Procedures adverse effects, Humans, Intraoperative Complications therapy, Middle Cerebral Artery surgery, Minimally Invasive Surgical Procedures, Neurosurgical Procedures adverse effects, Patient Safety, Skull anatomy & histology, Treatment Outcome, Aneurysm, Ruptured epidemiology, Endovascular Procedures methods, Intracranial Aneurysm surgery, Intraoperative Complications epidemiology, Neurosurgical Procedures methods
- Abstract
Objective: To assess the safety of the suprabrow approach (SBCA) for aneurysm surgery by comparing intraoperative rupture rates with those for the standard pterional approach., Methods: A systematic review of all literature published in or after 1997 was performed using specified search words. All articles described aneurysm surgery by one of two approaches (pterional or suprabrow) and mentioned the rate of intraoperative rupture. A total of 41 articles were found fit for inclusion for the final analysis. Articles that focused on giant, bilateral, posterior fossa, or previously coiled aneurysms were not included. The χ² test was used to compare the two cohorts and various subgroup analyses were carried out. A P value of <0.05 was considered significant., Results: The search of literature yielded 9488 aneurysm reports (41 articles), 7535 operated by the pterional approach and 1953 aneurysms by the SBCA. The overall intraoperative rupture (IOR) rate for the entire group was 9.20%. In the pterional craniotomy approach (PtCA) group, the rate of IOR was 10.09% and in the SBCA group, IOR occurred in 5.78%. The IOR rate in the PtCA group was almost double that in the SBCA group and the odds ratio (OR) for this difference was 1.8 (95% confidence interval [CI] 1.49-2.26; P < 0.001). A total of 3039 ruptured aneurysms were analyzed, 2848 aneurysms in the PtCA group and 191 in the SBCA group. The rate of IOR was 14.15% for the overall group, 13.8% in the PtCA group, and 19.37% in the SBCA group. The difference in IOR between the PtCA group and the SBCA group for ruptured aneurysms was found to be significant (OR 1.5, 95% CI 1.003-2.119; P < 0.05). The number of unruptured aneurysms in the PtCA group was 862 (39.4%) and in the SBCA group, it was 232 (49.1%). The difference in the number of unruptured aneurysms between the groups was significant (P < 0.001). The rate of IOR was significantly less with the SBCA than with the pterional approach., Conclusions: The rate of intraoperative rupture is significantly higher when ruptured aneurysms are operated with the SBCA (in comparison to the pterional approach). However, the SBCA may be safer for unruptured and middle cerebral artery aneurysms with a lower rate of IOR., (Copyright © 2013 Elsevier Inc. All rights reserved.)
- Published
- 2013
- Full Text
- View/download PDF
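The headline odds ratio in the abstract above can be reproduced from the counts it implies: 7535 pterional and 1953 suprabrow aneurysms with IOR rates of 10.09% and 5.78% correspond to roughly 760 and 113 rupture events:

```python
# Reproduce the reported overall odds ratio (~1.8) from the event counts
# implied by the abstract's group sizes and IOR rates.
def odds_ratio(events_a, total_a, events_b, total_b):
    """OR of group A vs group B from 2x2 event counts."""
    odds_a = events_a / (total_a - events_a)
    odds_b = events_b / (total_b - events_b)
    return odds_a / odds_b

# ~760 IOR events of 7535 pterional cases; ~113 of 1953 suprabrow cases.
or_pterional_vs_suprabrow = odds_ratio(760, 7535, 113, 1953)
```

The computed value rounds to the 1.8 reported in the abstract, confirming the arithmetic is internally consistent.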
31. Long-term outcomes of artificial bowel sphincter for fecal incontinence: a systematic review and meta-analysis.
- Author
-
Hong KD, Dasilva G, Kalaskar SN, Chong Y, and Wexner SD
- Subjects
- Fecal Incontinence diagnosis, Fecal Incontinence etiology, Humans, Time Factors, Treatment Outcome, Anal Canal, Artificial Organs, Fecal Incontinence surgery
- Published
- 2013
- Full Text
- View/download PDF