894 results
Search Results
2. A modeling study of budding yeast colony formation and its relationship to budding pattern and aging.
- Author
Wang, Yanli, Lo, Wing-Cheong, and Chou, Ching-Shan
- Subjects
YEAST fungi genetics ,BUDDING (Zoology) ,ELECTRIC properties of cells ,HAPLOIDY ,DIPLOIDY - Abstract
Budding yeast, which undergoes polarized growth during budding and mating, has been a useful model system to study cell polarization. Bud sites are selected differently in haploid and diploid yeast cells: haploid cells bud in an axial manner, while diploid cells bud in a bipolar manner. While previous studies have focused on the molecular details of bud site selection and polarity establishment, not much is known about how different budding patterns give rise to different functions at the population level. In this paper, we develop a two-dimensional agent-based model to study budding yeast colonies with cell-type specific biological processes, such as budding, mating, mating type switching, consumption of nutrients, and cell death. The model demonstrates that the axial budding pattern enhances mating probability at an early stage and the bipolar budding pattern improves colony development under nutrient limitation. Our results suggest that the frequency of mating type switching might control the trade-off between diploidization and inbreeding. The effect of cellular aging is also studied through our model. Based on the simulations, colonies initiated by an aged haploid cell show a declined mating probability at an early stage and recover as the rejuvenated offspring become the majority. Colonies initiated with aged diploid cells do not show a disadvantage in colony expansion, possibly because young cells contribute the most to colony expansion. [ABSTRACT FROM AUTHOR]
- Published
- 2017
3. Ten simple rules to consider regarding preprint submission.
- Author
Bourne, Philip E., Polka, Jessica K., Vale, Ronald D., and Kiley, Robert
- Subjects
PREPRINTS ,DATA mining ,LICENSES ,COPYRIGHT - Abstract
The article discusses rules to consider regarding preprint submission of scientific work to journals. Topics include an analysis of published papers undertaken by cell biologist Stephen Royle to estimate the average time from first submission to publication, data mining of written content to make better use of the knowledge it contains, and encouraging authors to choose licenses and formats that facilitate reuse while retaining copyright to their work.
- Published
- 2017
4. Personalized glucose forecasting for type 2 diabetes using data assimilation.
- Author
Albers, David J., Levine, Matthew, Gluckman, Bruce, Ginsberg, Henry, Hripcsak, George, and Mamykina, Lena
- Subjects
BLOOD sugar monitoring ,TYPE 2 diabetes ,QUALITY of life ,GLYCEMIC control ,BAYESIAN analysis ,GAUSSIAN processes - Abstract
Type 2 diabetes leads to premature death and reduced quality of life for 8% of Americans. Nutrition management is critical to maintaining glycemic control, yet it is difficult to achieve due to high individual differences in glycemic response to nutrition. Anticipating the glycemic impact of different meals can be challenging not only for individuals with diabetes, but also for expert diabetes educators. Personalized computational models that can accurately forecast the impact of a given meal on an individual’s blood glucose levels can serve as the engine for a new generation of decision support tools for individuals with diabetes. However, to be useful in practice, these computational engines need to generate accurate forecasts based on limited datasets consistent with typical self-monitoring practices of individuals with type 2 diabetes. This paper uses three forecasting machines: (i) data assimilation, a technique borrowed from atmospheric physics and engineering that uses Bayesian modeling to infuse data with human knowledge represented in a mechanistic model, to generate real-time, personalized, adaptable glucose forecasts; (ii) model averaging of data assimilation output; and (iii) dynamical Gaussian process model regression. The proposed data assimilation machine, the primary focus of the paper, uses a modified dual unscented Kalman filter to estimate states and parameters, personalizing the mechanistic models. Model selection is used to choose a personalized model for the individual and their measurement characteristics. The data assimilation forecasts are empirically evaluated against actual postprandial glucose measurements captured by individuals with type 2 diabetes, and against predictions generated by experienced diabetes educators after reviewing a set of historical nutritional records and glucose measurements for the same individual.
The evaluation suggests that the data assimilation forecasts compare well with specific glucose measurements and match or exceed in accuracy expert forecasts. We conclude by examining ways to present predictions as forecast-derived range quantities and evaluate the comparative advantages of these ranges. [ABSTRACT FROM AUTHOR]
- Published
- 2017
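The personalize-then-forecast loop behind the data assimilation approach above can be illustrated with a much smaller linear-Gaussian analogue. The paper couples a nonlinear mechanistic glucose model to a modified dual unscented Kalman filter; the sketch below instead uses a plain Kalman filter that jointly estimates glucose and an unknown personal baseline. The model, all parameter values, and the meal schedule are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stand-in for data assimilation: glucose g relaxes toward an unknown
# personal baseline b under known meal inputs u. A Kalman filter over the
# augmented state (g, b) "personalizes" b from noisy self-monitoring data.
dt, k = 10.0, 0.02                               # minutes per step, relaxation rate
F = np.array([[1 - k * dt, k * dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])                       # only glucose is measured
Q = np.diag([4.0, 0.01])                         # process noise covariance
R = np.array([[25.0]])                           # measurement noise, (mg/dL)^2

def assimilate(meals, zs, x, P):
    for u, z in zip(meals, zs):
        x = F @ x + np.array([u, 0.0])           # forecast step (meal adds u mg/dL)
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                      # update step: assimilate measurement
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.atleast_1d(z) - H @ x)
        P = (np.eye(2) - K @ H) @ P
    return x, P

# Synthetic "patient" with true baseline 100 mg/dL; the filter's prior says 80.
b_true, g = 100.0, 130.0
meals = [30.0 if t % 18 == 0 else 0.0 for t in range(144)]   # a meal every 3 h
zs = []
for u in meals:
    g = (1 - k * dt) * g + k * dt * b_true + u + rng.normal(0, 2)
    zs.append(g + rng.normal(0, 5))

x, P = assimilate(meals, zs, np.array([130.0, 80.0]), np.diag([100.0, 400.0]))
print(round(x[1], 1))   # baseline estimate, pulled from 80 toward 100
```

Once the baseline (a stand-in for the paper's personalized parameters) is estimated, the same model propagated forward without updates yields the personalized forecast.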
5. A case study in the functional consequences of scaling the sizes of realistic cortical models.
- Author
Joglekar, Madhura R., Chariker, Logan, Shapley, Robert, and Young, Lai-Sang
- Subjects
COMPUTATIONAL neuroscience ,VISUAL cortex ,ACTION potentials ,HUMAN behavior models ,CASE studies ,CEREBRAL cortex - Abstract
Neuroscience models come in a wide range of scales and specificity, from mean-field rate models to large-scale networks of spiking neurons. There are potential trade-offs between simplicity and realism, versatility and computational speed. This paper is about large-scale cortical network models, and the question we address is one of scalability: would scaling down cell density impact a network’s ability to reproduce cortical dynamics and function? We investigated this problem using a previously constructed realistic model of the monkey visual cortex that is true to size. Reducing cell density gradually up to 50-fold, we studied changes in model behavior. Size reduction without parameter adjustment was catastrophic. Surprisingly, relatively minor compensation in synaptic weights guided by a theoretical algorithm restored mean firing rates and basic function such as orientation selectivity to models 10-20 times smaller than the real cortex. Not all was normal in the reduced model cortices: intracellular dynamics acquired a character different from that of real neurons, and while the ability to relay feedforward inputs remained intact, reduced models showed signs of deficiency in functions that required dynamical interaction among cortical neurons. These findings are not confined to models of the visual cortex, and modelers should be aware of potential issues that accompany size reduction. Broader implications of this study include the importance of homeostatic maintenance of firing rates, and the functional consequences of feedforward versus recurrent dynamics, ideas that may shed light on other species and on systems suffering cell loss. [ABSTRACT FROM AUTHOR]
- Published
- 2019
6. A quick guide for using Microsoft OneNote as an electronic laboratory notebook.
- Author
Guerrero, Santiago, López-Cortés, Andrés, García-Cárdenas, Jennyfer M., Saa, Pablo, Indacochea, Alberto, Armendáriz-Castillo, Isaac, Zambrano, Ana Karina, Yumiceba, Verónica, Pérez-Villa, Andy, Guevara-Ramírez, Patricia, Moscoso-Zea, Oswaldo, Paredes, Joel, Leone, Paola E., and Paz-y-Miño, César
- Subjects
DATA recorders & recording ,MEDICAL research ,CLINICAL trials ,WORKFLOW ,RESEARCH institutes - Abstract
Scientific data recording and reporting systems are of great interest for endorsing reproducibility and transparency practices among the scientific community. Current research generates large datasets that can no longer be documented using paper lab notebooks (PLNs). In this regard, electronic laboratory notebooks (ELNs) could be a promising solution to replace PLNs and promote scientific reproducibility and transparency. We previously analyzed five ELNs and performed two survey-based studies to implement an ELN in a biomedical research institute. Among the ELNs tested, we found that Microsoft OneNote presents numerous features matching the best ELN functionalities. In addition, both surveyed groups preferred OneNote over a scientifically designed ELN (PerkinElmer Elements). However, OneNote remains a general note-taking application and has not been designed for scientific purposes. We therefore provide a quick guide to adapt OneNote to an ELN workflow that can also be adjusted to other nonscientific ELNs. [ABSTRACT FROM AUTHOR]
- Published
- 2019
7. Even a good influenza forecasting model can benefit from internet-based nowcasts, but those benefits are limited.
- Author
Osthus, Dave, Daughton, Ashlynn R., and Priedhorsky, Reid
- Subjects
INFLUENZA ,RESPIRATORY infections ,PUBLIC health ,MATHEMATICAL models of forecasting - Abstract
The ability to produce timely and accurate flu forecasts in the United States can significantly impact public health. Augmenting forecasts with internet data has shown promise for improving forecast accuracy and timeliness in controlled settings, but results in practice are less convincing, as models augmented with internet data have not consistently outperformed models without internet data. In this paper, we perform a controlled experiment, taking into account data backfill, to improve clarity on the benefits and limitations of augmenting an already good flu forecasting model with internet-based nowcasts. Our results show that a good flu forecasting model can benefit from the augmentation of internet-based nowcasts in practice for all considered public health-relevant forecasting targets. The degree of forecast improvement due to nowcasting, however, is uneven across forecasting targets, with short-term forecasting targets seeing the largest improvements and seasonal targets such as the peak timing and intensity seeing relatively marginal improvements. The uneven forecasting improvements across targets hold even when “perfect” nowcasts are used. These findings suggest that further improvements to flu forecasting, particularly seasonal targets, will need to derive from other, non-nowcasting approaches. [ABSTRACT FROM AUTHOR]
- Published
- 2019
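Why nowcasts help short-term targets far more than seasonal ones can be seen in a minimal synthetic experiment: for an autoregressive process, the information in the most recent data point decays geometrically with forecast horizon. The AR(1) stand-in, its coefficient, and the two-week reporting lag below are illustrative assumptions, not the paper's flu forecasting model.

```python
import numpy as np

rng = np.random.default_rng(5)

# AR(1) stand-in for weekly flu activity.
phi, n = 0.9, 50000
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

def forecast_mse(h, lag):
    """MSE of predicting x[t+h] from the newest available value, which is
    `lag` weeks old (lag=0 mimics a perfect nowcast, lag=2 a reporting delay)."""
    pred = phi ** (h + lag) * x[: n - h - lag]
    return np.mean((x[h + lag:] - pred) ** 2)

# Relative MSE reduction from nowcasting shrinks as the horizon grows.
gains = {h: 1 - forecast_mse(h, 0) / forecast_mse(h, 2) for h in (1, 4, 10)}
print({h: round(g, 2) for h, g in gains.items()})
```

The one-week-ahead forecast improves substantially, while the ten-week-ahead gain is marginal, echoing the abstract's finding that even "perfect" nowcasts leave seasonal targets largely unimproved.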
8. In silico analysis of antibiotic-induced Clostridium difficile infection: Remediation techniques and biological adaptations.
- Author
Jones, Eric W. and Carlson, Jean M.
- Subjects
CLOSTRIDIOIDES difficile ,LABORATORY mice ,INFECTION ,TOXINS ,DIAGNOSIS ,THERAPEUTICS - Abstract
In this paper we study antibiotic-induced C. difficile infection (CDI), caused by the toxin-producing C. difficile (CD), and implement clinically-inspired simulated treatments in a computational framework that synthesizes a generalized Lotka-Volterra (gLV) model with SIR modeling techniques. The gLV model uses parameters derived from an experimental mouse model, in which the mice are administered antibiotics and subsequently dosed with CD. We numerically identify which of the experimentally measured initial conditions are vulnerable to CD colonization, then formalize the notion of CD susceptibility analytically. We simulate fecal transplantation, a clinically successful treatment for CDI, and discover that both the transplant timing and transplant donor are relevant to the efficacy of the treatment, a result which has clinical implications. We incorporate two nongeneric yet dangerous attributes of CD into the gLV model, sporulation and antibiotic-resistant mutation, and for each identify relevant SIR techniques that describe the desired attribute. Finally, we rely on the results of our framework to analyze an experimental study of fecal transplants in mice, and are able to explain observed experimental results, validate our simulated results, and suggest model-motivated experiments. [ABSTRACT FROM AUTHOR]
- Published
- 2018
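A minimal sketch of the gLV framework described above, with hypothetical parameters (not the mouse-derived values of the paper) chosen to make the community bistable, so that a healthy microbiome repels a small C. difficile inoculum while an antibiotic-depleted one lets a CD bloom persist:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Generalized Lotka-Volterra (gLV) toy: dx_i/dt = x_i (r_i + sum_j A_ij x_j).
# Species: two commensals and C. difficile (CD). Growth rates and the
# interaction matrix are hypothetical, not the paper's fitted parameters.
r = np.array([0.8, 0.5, 0.6])
A = np.array([[-1.0, -0.3, -1.5],
              [-0.2, -1.0, -1.0],
              [-0.9, -0.7, -1.0]])

def glv(t, x):
    return x * (r + A @ x)

# A healthy community repels a small CD inoculum...
healthy = solve_ivp(glv, (0, 50), [0.7, 0.4, 0.01], rtol=1e-8).y[:, -1]
# ...but after an antibiotic knock-down of the commensals, an established CD
# population persists: the system is bistable.
infected = solve_ivp(glv, (0, 50), [0.01, 0.01, 0.6], rtol=1e-8).y[:, -1]
print(healthy.round(3), infected.round(3))
```

In this bistable setting a simulated fecal transplant is just an instantaneous boost to the commensal abundances; whether it flips the state back depends on its size and timing, which is the qualitative result reported above.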
9. Forecasting Human African Trypanosomiasis Prevalences from Population Screening Data Using Continuous Time Models.
- Author
De Vries, Harwin, Wagelmans, Albert P. M., Hasker, Epco, Lumbala, Crispin, Lutumba, Pascal, De Vlas, Sake J., and Klundert, Joris Van De
- Subjects
AFRICAN trypanosomiasis ,MEDICAL screening ,DISEASE prevalence ,DISEASE progression ,EPIDEMICS ,DIAGNOSIS - Abstract
To eliminate and eradicate gambiense human African trypanosomiasis (HAT), maximizing the effectiveness of active case finding is of key importance. The progression of the epidemic is largely influenced by the planning of these operations. This paper introduces and analyzes five models for predicting HAT prevalence in a given village based on past observed prevalence levels and past screening activities in that village. Based on the quality of prevalence level predictions in 143 villages in Kwamouth (DRC), and based on the theoretical foundation underlying the models, we consider variants of the Logistic Model—a model inspired by the SIS epidemic model—to be most suitable for predicting HAT prevalence levels. Furthermore, we demonstrate the applicability of this model to predict the effects of planning policies for screening operations. Our analysis yields an analytical expression for the screening frequency required to reach eradication (zero prevalence) and a simple approach for determining the frequency required to reach elimination within a given time frame (one case per 10000). Furthermore, the model predictions suggest that annual screening is only expected to lead to eradication if at least half of the cases are detected during the screening rounds. This paper extends knowledge on control strategies for HAT and serves as a basis for further modeling and optimization studies. [ABSTRACT FROM AUTHOR]
- Published
- 2016
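The analytic eradication condition mentioned above can be reproduced in a toy logistic (SIS-type) prevalence model: between screening rounds prevalence grows logistically (which has a closed form), and each round detects and removes a fraction of cases. Eradication under screening every T years requires (1 − d)·exp(rT) < 1, i.e. d > 1 − exp(−rT). The parameters are illustrative; the growth rate is set to ln 2 per year so that the annual-screening threshold lands exactly at one half, mirroring the "at least half of the cases" finding.

```python
import numpy as np

def logistic_step(p, r, K, dt):
    """Closed-form logistic growth of prevalence p over an interval dt."""
    g = np.exp(r * dt)
    return K * p * g / (K + p * (g - 1.0))

def simulate(p0, detect_frac, r=np.log(2), K=0.2, years=30):
    """Annual screening: each round detects and removes a fraction of cases."""
    p = p0
    for _ in range(years):
        p = logistic_step(p, r, K, 1.0)    # one year of transmission
        p *= 1.0 - detect_frac             # screening round
    return p

# With r = ln 2 (cases double yearly at low prevalence), annual screening must
# detect more than half of cases for prevalence to decline toward zero.
print(simulate(0.05, 0.6))   # detection above 1/2: prevalence collapses
print(simulate(0.05, 0.4))   # detection below 1/2: prevalence persists
```

The same closed form gives the screening frequency needed to reach a target prevalence within a fixed time frame by solving the geometric decay for T.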
10. A simulation of the random and directed motion of dendritic cells in chemokine fields.
- Author
Parr, Avery, Anderson, Nicholas R., and Hammer, Daniel A.
- Subjects
DENDRITIC cells ,CHEMOTAXIS ,CHEMOKINE receptors ,CELL receptors ,ANTIGEN presenting cells ,T cells ,MOTION - Abstract
Dendritic cells (DCs) are the most effective professional antigen-presenting cell. They ferry antigen from the extremities to T cells and are essential for the initiation of an adaptive immune response. Despite interest in how DCs respond to chemical stimuli, there have been few attempts to model DC migration. In this paper, we simulate the motility of DCs by modeling the generation of forces by filopodia and a force balance on the cell. The direction of filopodial extension is coupled to differential occupancy of cognate chemokine receptors across the cell. Our model simulates chemokinesis and chemotaxis in a variety of chemical and mechanical environments. Simulated DCs undergoing chemokinesis were measured to have a speed of 5.1 ± 0.07 μm·min⁻¹ and a persistence time of 3.2 ± 0.46 min, consistent with experiment. Cells undergoing chemotaxis exhibited a stronger chemotactic response when exposed to lower average chemokine concentrations, also consistent with experiment. We predicted that when placed in two opposing gradients, cells will cluster in a line, which we call the “line of equistimulation;” this clustering has also been observed. We calculated the effect of varying gradient steepness on the line of equistimulation, with steeper gradients resulting in tighter clustering. Moreover, gradients are found to be most potent when cells are in a gradient of chemokine whose mean concentration is close to the receptor binding Kd, and least potent when the mean concentration is 0.1Kd. Comparing our simulations to experiment, we can give a quantitative measure of the strength of certain chemokines relative to others. Assigning the signal of CCL19 binding CCR7 a baseline strength of 1, we found CCL21 binding CCR7 had a strength of 0.28, and CXCL12 binding CXCR4 had a strength of 0.30. These differences emerge despite both chemokines having virtually the same Kd, suggesting a mechanism of signal amplification in DCs requiring further study. [ABSTRACT FROM AUTHOR]
- Published
- 2019
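The chemokinetic statistics quoted above (speed and persistence time) are exactly the two parameters of a persistent random walk, so a generic stand-in (not the paper's filopodial force-balance model) already reproduces the ensemble behavior; its mean-squared displacement should follow Fürth's formula:

```python
import numpy as np

rng = np.random.default_rng(3)

# Persistent random walk at the reported DC speed and persistence time.
s, P, dt, T, n_cells = 5.1, 3.2, 0.1, 50.0, 200   # um/min, min, min, min, cells
n = int(T / dt)
dtheta = rng.normal(0.0, np.sqrt(2 * dt / P), (n_cells, n))  # rotational diffusion
theta = np.cumsum(dtheta, axis=1)
steps = s * dt * np.stack([np.cos(theta), np.sin(theta)], axis=-1)
pos = np.cumsum(steps, axis=1)
msd = np.mean(np.sum(pos[:, -1] ** 2, axis=-1))   # mean-squared displacement at T

# Furth's formula: MSD(T) = 2 s^2 P (T - P (1 - exp(-T/P)))
msd_theory = 2 * s**2 * P * (T - P * (1 - np.exp(-T / P)))
print(round(msd), round(msd_theory))
```

Chemotaxis enters such models as a directional bias on the angle dynamics, which is where the receptor-occupancy coupling of the paper's model would plug in.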
11. Ten simple rules to create biological network figures for communication.
- Author
Marai, G. Elisabeta, Pinaud, Bruno, Bühler, Katja, Lex, Alexander, and Morris, John H.
- Subjects
TELECOMMUNICATION systems ,BIOLOGICAL networks ,MEDICAL literature ,PHYSICAL sciences ,REFERENCE sources ,BIOLOGY - Abstract
Biological network figures are ubiquitous in the biology and medical literature. On the one hand, a good network figure can quickly provide information about the nature and degree of interactions between items and enable inferences about the reason for those interactions. On the other hand, good network figures are difficult to create. In this paper, we outline 10 simple rules for creating biological network figures for communication, from choosing layouts, to applying color or other channels to show attributes, to the use of layering and separation. These rules are accompanied by illustrative examples. We also provide a concise set of references and additional resources for each rule. [ABSTRACT FROM AUTHOR]
- Published
- 2019
12. Optimizing spatial allocation of seasonal influenza vaccine under temporal constraints.
- Author
Venkatramanan, Srinivasan, Chen, Jiangzhuo, Fadikar, Arindam, Gupta, Sandeep, Higdon, Dave, Lewis, Bryan, Marathe, Madhav, Mortveit, Henning, and Vullikanti, Anil
- Subjects
SEASONAL influenza ,INFLUENZA vaccines ,FLU vaccine efficacy ,HEALTH policy - Abstract
Prophylactic interventions such as vaccine allocation are some of the most effective public health policy planning tools. The supply of vaccines, however, is limited, and an important challenge is to optimally allocate the vaccines to minimize epidemic impact. This resource allocation question (which we refer to as VID) has multiple dimensions: when, where, to whom, etc. Most of the existing literature on this topic deals with the latter (to whom), proposing policies that prioritize individuals by age and disease risk. However, since seasonal influenza spread has a typical spatial trend, and due to the temporal constraints enforced by the availability schedule, the when and where problems become equally, if not more, relevant. In this paper, we study the VID problem in the context of seasonal influenza spread in the United States. We develop a national scale metapopulation model for influenza that integrates both short and long distance human mobility, along with realistic data on vaccine uptake. We also design GA, a greedy algorithm for allocating the vaccine supply at the state level under temporal constraints, and show that such a strategy improves over the current baseline of pro-rata allocation, and the improvement is more pronounced for higher vaccine efficacy and moderate flu season intensity. Further, the resulting strategy resembles a ring vaccination applied spatially across the US. [ABSTRACT FROM AUTHOR]
- Published
- 2019
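The flavor of greedy allocation under temporal constraints (supply arriving in weekly batches) can be sketched as follows. The per-state marginal-benefit curves here are hypothetical, with simple diminishing returns; in the real VID problem the benefit of a dose depends on epidemic timing through the metapopulation model, which is exactly why the temporal dimension matters.

```python
import heapq

# Hypothetical marginal benefit of the k-th batch sent to a state:
# m(k) = base * decay**k (infections averted; diminishing returns).
states = {"CA": (100.0, 0.7), "TX": (80.0, 0.8), "NY": (60.0, 0.9)}

def greedy_allocate(states, weekly_supply):
    """Assign each batch, as supply arrives week by week, to the state whose
    next batch averts the most infections (max-heap on marginal benefit)."""
    alloc = {s: 0 for s in states}
    heap = [(-base, s) for s, (base, _) in states.items()]  # negate: min-heap
    heapq.heapify(heap)
    total_benefit = 0.0
    for batches in weekly_supply:          # temporal constraint: staged supply
        for _ in range(batches):
            neg_m, s = heapq.heappop(heap)
            total_benefit += -neg_m
            alloc[s] += 1
            _, decay = states[s]
            heapq.heappush(heap, (neg_m * decay, s))
    return alloc, total_benefit

alloc, benefit = greedy_allocate(states, weekly_supply=[2, 3, 2])
print(alloc, round(benefit, 1))
```

For a separable objective with diminishing returns like this one, the greedy choice is optimal; the paper's setting couples states through disease spread, so its GA is a heuristic evaluated by simulation.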
13. Identifying nonlinear dynamical systems via generative recurrent neural networks with applications to fMRI.
- Author
Koppe, Georgia, Toutounji, Hazem, Kirsch, Peter, Lis, Stefanie, and Durstewitz, Daniel
- Subjects
RECURRENT neural networks ,NONLINEAR dynamical systems ,LINEAR dynamical systems ,FUNCTIONAL magnetic resonance imaging ,DYNAMICAL systems - Abstract
A major tenet in theoretical neuroscience is that cognitive and behavioral processes are ultimately implemented in terms of the neural system dynamics. Accordingly, a major aim for the analysis of neurophysiological measurements should lie in the identification of the computational dynamics underlying task processing. Here we advance a state space model (SSM) based on generative piecewise-linear recurrent neural networks (PLRNN) to assess dynamics from neuroimaging data. In contrast to many other nonlinear time series models which have been proposed for reconstructing latent dynamics, our model is easily interpretable in neural terms, amenable to systematic dynamical systems analysis of the resulting set of equations, and can straightforwardly be transformed into an equivalent continuous-time dynamical system. The major contributions of this paper are the introduction of a new observation model suitable for functional magnetic resonance imaging (fMRI) coupled to the latent PLRNN, an efficient stepwise training procedure that forces the latent model to capture the ‘true’ underlying dynamics rather than just fitting (or predicting) the observations, and an empirical measure based on the Kullback-Leibler divergence to evaluate from empirical time series how well this goal of approximating the underlying dynamics has been achieved. We validate and illustrate the power of our approach on simulated ‘ground-truth’ dynamical systems as well as on experimental fMRI time series, and demonstrate that the learnt dynamics harbors task-related nonlinear structure that a linear dynamical model fails to capture. Given that fMRI is one of the most common techniques for measuring brain activity non-invasively in human subjects, this approach may provide a novel step toward analyzing aberrant (nonlinear) dynamics for clinical assessment or neuroscientific research. [ABSTRACT FROM AUTHOR]
- Published
- 2019
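The latent model at the core of the approach above is compact enough to state in a few lines: a piecewise-linear RNN. The sketch below uses random parameters and omits the fMRI observation model (hemodynamics) and the stepwise training procedure, which are the paper's actual contributions:

```python
import numpy as np

rng = np.random.default_rng(4)

# Piecewise-linear RNN (PLRNN) latent dynamics:
#   z_{t+1} = A z_t + W phi(z_t) + h + noise,  with phi = ReLU.
# Parameters are random stand-ins; in the paper they are learned from fMRI.
d = 5
A = np.diag(rng.uniform(0.3, 0.8, d))      # diagonal linear (autoregressive) part
W = 0.1 * rng.normal(size=(d, d))          # off-diagonal piecewise-linear coupling
np.fill_diagonal(W, 0.0)
h = rng.normal(size=d)                     # constant drive

def plrnn_step(z, noise_sd=0.05):
    return A @ z + W @ np.maximum(z, 0.0) + h + rng.normal(0.0, noise_sd, d)

z = np.zeros(d)
traj = np.empty((200, d))
for t in range(200):
    z = plrnn_step(z)
    traj[t] = z
print(traj.shape)
```

Because each orthant of state space has its own linear map, fixed points and their stability can be analyzed in closed form per orthant, which is what makes the model "amenable to systematic dynamical systems analysis."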
14. Mathematical model predicts anti-adhesion–antibiotic–debridement combination therapies can clear an antibiotic resistant infection.
- Author
Roberts, Paul A., Huebinger, Ryan M., Keen, Emma, Krachler, Anne-Marie, and Jabbari, Sara
- Subjects
MULTIVALENT molecules ,ANTIBIOTICS ,MATHEMATICAL models ,THERAPEUTICS ,POLYSTYRENE ,ORDINARY differential equations - Abstract
As antimicrobial resistance increases, it is crucial to develop new treatment strategies to counter the emerging threat. In this paper, we consider combination therapies involving conventional antibiotics and debridement, coupled with a novel anti-adhesion therapy, and their use in the treatment of antimicrobial resistant burn wound infections. Our models predict that anti-adhesion–antibiotic–debridement combination therapies can eliminate a bacterial infection in cases where each treatment in isolation would fail. Antibiotics are assumed to have a bactericidal mode of action, killing bacteria, while debridement involves physically cleaning a wound (e.g., with a cloth), removing free bacteria. Anti-adhesion therapy can take a number of forms. Here we consider adhesion inhibitors consisting of polystyrene microbeads chemically coupled to a protein known as multivalent adhesion molecule 7, an adhesin which mediates the initial stages of attachment of many bacterial species to host cells. Adhesion inhibitors competitively inhibit bacteria from binding to host cells, thus rendering them susceptible to removal through debridement. An ordinary differential equation model is developed and the antibiotic-related parameters are fitted against new in vitro data gathered for the present study. The model is used to predict treatment outcomes and to suggest optimal treatment strategies. Our model predicts that anti-adhesion and antibiotic therapies will combine synergistically, producing a combined effect which is often greater than the sum of their individual effects, and that anti-adhesion–antibiotic–debridement combination therapy will be more effective than any of the treatment strategies used in isolation. Further, the use of inhibitors significantly reduces the minimum dose of antibiotics required to eliminate an infection, reducing the chances that bacteria will develop increased resistance.
Lastly, we use our model to suggest treatment regimens capable of eliminating bacterial infections within clinically relevant timescales. [ABSTRACT FROM AUTHOR]
- Published
- 2019
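A stripped-down combination-therapy ODE model can be written in a few lines. The two-compartment structure (free vs. host-adhered bacteria) and all parameter values below are illustrative assumptions, not the paper's fitted model; they are chosen so that single treatments fail while the combination clears the infection, mirroring the qualitative claim above:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy wound model: free (F) and host-adhered (A) bacteria. The antibiotic
# kills only free cells (adhered cells are assumed protected), the adhesion
# inhibitor blocks attachment, and debridement removes free cells.
r, K = 1.0, 1.0          # logistic growth rate and carrying capacity
kb, koff = 3.0, 1.5      # attachment and detachment rates

def rhs(t, y, ka, inh):
    F, A = y
    crowd = 1 - (F + A) / K
    dF = r * F * crowd - kb * (1 - inh) * F + koff * A - ka * F
    dA = r * A * crowd + kb * (1 - inh) * F - koff * A
    return [dF, dA]

def treat(ka=0.0, inh=0.0, debride=0.0, t_end=30):
    y = [0.1, 0.5]
    for _ in range(t_end):                       # one debridement per time unit
        y = solve_ivp(rhs, (0, 1), y, args=(ka, inh), rtol=1e-8).y[:, -1]
        y[0] *= 1 - debride                      # debridement removes free cells
    return float(y.sum())

print(round(treat(ka=2.0), 3))                          # antibiotic alone: persists
print(round(treat(debride=0.7), 3))                     # debridement alone: persists
print(round(treat(ka=2.0, inh=0.95, debride=0.7), 5))   # combination: cleared
```

The synergy arises because the inhibitor stops detached cells from re-adhering, holding them in the free compartment where the antibiotic and debridement act, which is the mechanism the abstract describes.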
15. Per-sample immunoglobulin germline inference from B cell receptor deep sequencing data.
- Author
Ralph, Duncan K. and Matsen IV, Frederick A.
- Subjects
B cell receptors ,IMMUNOGLOBULIN genes ,B cells ,ALLELES - Abstract
The collection of immunoglobulin genes in an individual’s germline, which gives rise to B cell receptors via recombination, is known to vary significantly across individuals. In humans, for example, each individual has only a fraction of the several hundred known V alleles. Furthermore, the currently-accepted set of known V alleles is both incomplete (particularly for non-European samples), and contains a significant number of spurious alleles. The resulting uncertainty as to which immunoglobulin alleles are present in any given sample results in inaccurate B cell receptor sequence annotations, and in particular inaccurate inferred naive ancestors. In this paper we first show that the currently widespread practice of aligning each sequence to its closest match in the full set of IMGT alleles results in a very large number of spurious alleles that are not in the sample’s true set of germline V alleles. We then describe a new method for inferring each individual’s germline gene set from deep sequencing data, and show that it improves upon existing methods by making a detailed comparison on a variety of simulated and real data samples. This new method has been integrated into the partis annotation and clonal family inference package, available at , and is run by default without affecting overall run time. [ABSTRACT FROM AUTHOR]
- Published
- 2019
16. Ensemble of decision tree reveals potential miRNA-disease associations.
- Author
Chen, Xing, Zhu, Chi-Chi, and Yin, Jun
- Subjects
DIMENSION reduction (Statistics) ,DECISION trees ,RENAL cancer ,THERAPEUTICS ,BREAST tumors ,MICRORNA - Abstract
In recent years, increasing numbers of associations between microRNAs (miRNAs) and human diseases have been identified. Based on accumulating biological data, many computational models for inferring potential miRNA-disease associations have been developed, which saves time and expenditure on experimental studies, making great contributions to researching the molecular mechanisms of human diseases and developing new drugs for disease treatment. In this paper, we proposed a novel computational method named Ensemble of Decision Tree based MiRNA-Disease Association prediction (EDTMDA), which innovatively built a computational framework integrating ensemble learning and dimensionality reduction. For each miRNA-disease pair, the feature vector was extracted by calculating the statistical measures, graph theoretical measures, and matrix factorization results for the miRNA and disease, respectively. Then multiple base learners were built to yield many decision trees (DTs) based on random selection of negative samples and miRNA/disease features. In particular, principal component analysis was applied in each base learner to reduce feature dimensionality and hence remove noise or redundancy. An averaging strategy was adopted over these DTs to get final association scores between miRNAs and diseases. In model performance evaluation, EDTMDA showed an AUC of 0.9309 in global leave-one-out cross validation (LOOCV) and an AUC of 0.8524 in local LOOCV. Additionally, an AUC of 0.9192 ± 0.0009 in 5-fold cross validation demonstrated the model’s reliability and stability. Furthermore, three types of case studies for four human diseases were implemented. As a result, 94% (Esophageal Neoplasms), 86% (Kidney Neoplasms), 96% (Breast Neoplasms) and 88% (Carcinoma Hepatocellular) of the top 50 predicted miRNAs were confirmed by experimental evidence in the literature. [ABSTRACT FROM AUTHOR]
- Published
- 2019
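The ensemble construction described above (random negative sampling, random feature subsets, per-learner PCA, score averaging) can be sketched with scikit-learn on random stand-in features; nothing below uses the paper's actual miRNA/disease data or feature definitions:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Random stand-ins for miRNA-disease pair features (the paper uses
# statistical, graph-theoretical, and matrix-factorization measures).
n_pos, n_unlabeled, n_feat = 100, 1000, 40
X_pos = rng.normal(1.0, 1.0, (n_pos, n_feat))     # known associations
X_unl = rng.normal(0.0, 1.0, (n_unlabeled, n_feat))  # candidate pairs

def edt_scores(X_pos, X_unl, n_trees=20, n_sub_feat=25, n_components=10):
    """Each base learner: sample negatives from the unlabeled pairs, sample a
    feature subset, reduce with PCA, fit a decision tree; average the scores."""
    scores = np.zeros(len(X_unl))
    for _ in range(n_trees):
        neg_idx = rng.choice(len(X_unl), size=len(X_pos), replace=False)
        feat_idx = rng.choice(X_pos.shape[1], size=n_sub_feat, replace=False)
        X_train = np.vstack([X_pos[:, feat_idx], X_unl[neg_idx][:, feat_idx]])
        y_train = np.r_[np.ones(len(X_pos)), np.zeros(len(X_pos))]
        pca = PCA(n_components=n_components).fit(X_train)
        tree = DecisionTreeClassifier(max_depth=5, random_state=0)
        tree.fit(pca.transform(X_train), y_train)
        scores += tree.predict_proba(pca.transform(X_unl[:, feat_idx]))[:, 1]
    return scores / n_trees

scores = edt_scores(X_pos, X_unl)
print(scores.shape, float(scores.min()), float(scores.max()))
```

Averaging over trees trained on different negative samples is what makes the method robust to the fact that "negative" miRNA-disease pairs are really just unlabeled.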
17. Primacy coding facilitates effective odor discrimination when receptor sensitivities are tuned.
- Author
Zwicker, David
- Subjects
ODORS ,BINARY codes ,COMPUTATIONAL biology ,COMPUTATIONAL neuroscience ,OLFACTORY receptors ,SENSORY perception - Abstract
The olfactory system faces the difficult task of identifying an enormous variety of odors independent of their intensity. Primacy coding, where the odor identity is encoded by the receptor types that respond earliest, might provide a compact and informative representation that can be interpreted efficiently by the brain. In this paper, we analyze the information transmitted by a simple model of primacy coding using numerical simulations and statistical descriptions. We show that the encoded information depends strongly on the number of receptor types included in the primacy representation, but only weakly on the size of the receptor repertoire. The representation is independent of the odor intensity and the transmitted information is useful to perform typical olfactory tasks with close to experimentally measured performance. Interestingly, we find situations in which a smaller receptor repertoire is advantageous for discriminating odors. The model also suggests that overly sensitive receptor types could dominate the entire response and make the whole array useless, which allows us to predict how receptor arrays need to adapt to stay useful during environmental changes. Taken together, we show that primacy coding is more useful than simple binary and normalized coding, essentially because the sparsity of the odor representations is independent of the odor statistics, in contrast to the alternatives. Primacy coding thus provides an efficient odor representation that is independent of the odor intensity and might thus help to identify odors in the olfactory cortex. [ABSTRACT FROM AUTHOR]
- Published
- 2019
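The core of primacy coding is easy to state: the odor identity is the set of the p receptor types driven hardest (hence responding earliest). A toy model with a random sensitivity matrix shows the representation's intensity invariance; all numbers below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Receptor excitation is S @ c for odor concentration vector c; the primacy
# code is the set of the p most strongly driven receptor types.
n_receptors, n_odorants, p = 50, 200, 8
S = rng.lognormal(0.0, 1.0, (n_receptors, n_odorants))  # random sensitivities

def primacy_set(c, p=p):
    excitation = S @ c
    return frozenset(np.argsort(excitation)[-p:])

# A sparse odor: three odorants at different concentrations.
odor = np.zeros(n_odorants)
odor[rng.choice(n_odorants, 3, replace=False)] = [1.0, 0.5, 0.2]

# Intensity invariance: scaling the whole odor leaves the primacy set unchanged,
# because scaling multiplies every excitation by the same factor.
print(primacy_set(odor) == primacy_set(10.0 * odor))
```

This also makes the sparsity of the representation exactly p regardless of odor statistics, which is the property the abstract credits for primacy coding beating simple binary and normalized codes.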
18. Large vessels as a tree of transmission lines incorporated in the CircAdapt whole-heart model: A computational tool to examine heart-vessel interaction.
- Author
Heusinkveld, Maarten H. G., Huberts, Wouter, Lumens, Joost, Arts, Theo, Delhaas, Tammo, and Reesink, Koen D.
- Subjects
ELECTRIC lines ,TIMBERLINE ,SYSTOLIC blood pressure ,BLOOD pressure ,CAROTID artery ,HEMODYNAMICS - Abstract
We developed a whole-circulation computational model by integrating a transmission line (TL) model describing vascular wave transmission into the established CircAdapt platform of whole-heart mechanics. In the present paper, we verify the numerical framework of our TL model by benchmark comparison to a previously validated pulse wave propagation (PWP) model. Additionally, we showcase the integrated CircAdapt–TL model, which now includes the heart as well as extensive arterial and venous trees with terminal impedances. We present CircAdapt–TL haemodynamics simulations of: 1) a systemic normotensive situation and 2) a systemic hypertensive situation. In the TL–PWP benchmark comparison we found good agreement regarding pressure and flow waveforms (relative errors ≤ 2.9% for pressure, and ≤ 5.6% for flow). CircAdapt–TL simulations reproduced the typically observed haemodynamic changes with hypertension, expressed by increases in mean and pulsatile blood pressures, and increased arterial pulse wave velocity. We observed a change in the timing of pressure augmentation (defined as a late-systolic boost in aortic pressure) from occurring after the time of peak systolic pressure in the normotensive situation, to occurring prior to the time of peak pressure in the hypertensive situation. The pressure augmentation could not be observed when the systemic circulation was lumped into a (non-linear) three-element windkessel model, instead of using our TL model. Wave intensity analysis at the carotid artery indicated earlier arrival of reflected waves with hypertension as compared to normotension, in good qualitative agreement with findings in patients. In conclusion, we successfully embedded a TL model as a vascular module into the CircAdapt platform. The integrated CircAdapt–TL model allows detailed mechanistic studies of heart-vessel interaction. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
19. Properties of cardiac conduction in a cell-based computational model.
- Author
-
Jæger, Karoline Horgmo, Edwards, Andrew G., McCulloch, Andrew, and Tveito, Aslak
- Subjects
CARDIAC arrest ,HEART cells ,HEART conduction system ,COMPUTATIONAL acoustics ,SODIUM channels - Abstract
The conduction of electrical signals through cardiac tissue is essential for maintaining the function of the heart, and conduction abnormalities are known to potentially lead to life-threatening arrhythmias. The properties of cardiac conduction have therefore been the topic of intense study for decades, but a number of questions related to the mechanisms of conduction still remain unresolved. In this paper, we demonstrate how the so-called EMI model may be used to study some of these open questions. In the EMI model, the extracellular space, the cell membrane, the intracellular space and the cell connections are all represented as separate parts of the computational domain, and the model therefore allows for study of local properties that are hard to represent in the classical homogenized bidomain or monodomain models commonly used to study cardiac conduction. We conclude that a non-uniform sodium channel distribution increases the conduction velocity and decreases the time delays over gap junctions with reduced coupling in the EMI model simulations. We also present a theoretical optimal cell length with respect to conduction velocity and consider the possibility of ephaptic coupling (i.e. cell-to-cell coupling through the extracellular potential) acting as an alternative or supporting mechanism to gap junction coupling. We conclude that for a non-uniform distribution of sodium channels and a sufficiently small intercellular distance, ephaptic coupling can influence the dynamics of the sodium channels and potentially provide cell-to-cell coupling when the gap junction connection is absent. [ABSTRACT FROM AUTHOR]
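To illustrate the homogenized baseline the EMI model refines, here is a sketch of a 1-D monodomain-style reaction–diffusion cable with bistable cubic kinetics, measuring conduction velocity between two probe points. The kinetics, grid, and parameters are illustrative assumptions, not the EMI model or any ionic model from the paper.

```python
# 1-D homogenized cable sketch: dv/dt = D*v_xx + v(1-v)(v-a), explicit Euler.
import numpy as np

n, dx, dt = 400, 0.01, 0.02          # grid points, cm, ms (assumed)
D, a = 1e-3, 0.15                    # diffusion (cm^2/ms), excitation threshold
v = np.zeros(n)
v[:20] = 1.0                         # stimulate the left end

def step(v):
    lap = np.zeros_like(v)
    lap[1:-1] = (v[2:] - 2 * v[1:-1] + v[:-2]) / dx**2
    lap[0], lap[-1] = lap[1], lap[-2]          # crude no-flux boundaries
    return v + dt * (D * lap + v * (1 - v) * (v - a))

# record when the wavefront (v crossing 0.5) passes two probe points
t_a = t_b = None
for i in range(20000):
    v = step(v)
    if t_a is None and v[100] > 0.5: t_a = i * dt
    if t_b is None and v[300] > 0.5: t_b = i * dt
    if t_b is not None: break

cv = (300 - 100) * dx / (t_b - t_a)   # conduction velocity in cm/ms
print(round(cv, 3))
```

In a homogenized description like this, conduction velocity is a smooth function of bulk parameters; the EMI model's point is precisely that cell-scale structure (gap junctions, channel placement) is invisible here.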
- Published
- 2019
- Full Text
- View/download PDF
20. A neuromechanistic model for rhythmic beat generation.
- Author
-
Bose, Amitabha, Byrne, Áine, and Rinzel, John
- Subjects
SYNCHRONIZATION ,TIME measurements ,NOISE ,FREQUENCIES of oscillating systems ,OSCILLATIONS - Abstract
When listening to music, humans can easily identify and move to the beat. Numerous experimental studies have identified brain regions that may be involved with beat perception and representation. Several theoretical and algorithmic approaches have been proposed to account for this ability. Related to, but different from, the issue of how we perceive a beat is the question of how we learn to generate and hold a beat. In this paper, we introduce a neuronal framework for a beat generator that is capable of learning isochronous rhythms over a range of frequencies that are relevant to music and speech. Our approach combines ideas from error-correction and entrainment models to investigate the dynamics of how a biophysically-based neuronal network model synchronizes its period and phase to match that of an external stimulus. The model makes novel use of on-going faster gamma rhythms to form a set of discrete clocks that provide estimates, but not exact information, of how well the beat generator spike times match those of a stimulus sequence. The beat generator is endowed with plasticity allowing it to quickly learn and thereby adjust its spike times to achieve synchronization. Our model makes generalizable predictions about the existence of asymmetries in the synchronization process, as well as specific predictions about resynchronization times after changes in stimulus tempo or phase. Analysis of the model demonstrates that accurate rhythmic time keeping can be achieved over a range of frequencies relevant to music, in a manner that is robust to changes in parameters and to the presence of noise. [ABSTRACT FROM AUTHOR]
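The error-correction idea the abstract builds on can be sketched with a classic two-process update: each beat, the generator corrects both its phase and its internal period in proportion to the asynchrony with the stimulus. The gains and initial conditions below are assumptions for illustration; this is not the paper's gamma-clock spiking network.

```python
# Two-process error-correction sketch of beat synchronization.
alpha, beta = 0.3, 0.1          # phase- and period-correction gains (assumed)
S = 0.6                         # stimulus inter-onset interval, s (= 100 bpm)

t, T = 0.05, 0.75               # generator starts late and with too slow a period
asyn_history = []
for n in range(40):
    s = n * S                   # n-th stimulus onset
    asyn = t - s                # asynchrony between generated and heard beat
    asyn_history.append(asyn)
    T = T - beta * asyn         # adapt the internal period
    t = t + T - alpha * asyn    # schedule the next beat with phase correction

print(round(asyn_history[0], 3), round(abs(asyn_history[-1]), 4), round(T, 3))
```

With these gains the linearized error map has eigenvalues inside the unit circle, so both the asynchrony and the period mismatch decay to near zero, mirroring the resynchronization behaviour the model predicts after tempo or phase changes.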
- Published
- 2019
- Full Text
- View/download PDF
21. A Bayesian framework for the analysis of systems biology models of the brain.
- Author
-
Russell-Buckland, Joshua, Barnes, Christopher P., and Tachtsidis, Ilias
- Subjects
BAYESIAN analysis ,BRAIN physiology ,SYSTEMS biology ,SENSITIVITY analysis ,MODELS & modelmaking - Abstract
Systems biology models are used to understand complex biological and physiological systems. Interpretation of these models is an important part of developing this understanding. These models are often fit to experimental data in order to understand how the system has produced various phenomena or behaviour that are seen in the data. In this paper, we have outlined a framework that can be used to perform Bayesian analysis of complex systems biology models. In particular, we have focussed on analysing a systems biology model of the brain using both simulated and measured data. By using a combination of sensitivity analysis and approximate Bayesian computation, we have shown that it is possible to obtain distributions of parameters that can better guard against misinterpretation of results, as compared to a maximum likelihood estimate based approach. This is done through analysis of simulated and experimental data. NIRS measurements were simulated using the same simulated systemic input data for the model in a ‘healthy’ and ‘impaired’ state. By analysing both of these datasets, we show that different parameter spaces can be distinguished and compared between different physiological states or conditions. Finally, we analyse experimental data using the new Bayesian framework and the previous maximum likelihood estimate approach, showing that the Bayesian approach provides a more complete understanding of the parameter space. [ABSTRACT FROM AUTHOR]
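The approximate Bayesian computation step mentioned here can be illustrated with the simplest ABC variant, rejection sampling, on a toy one-parameter decay model. The model, prior, tolerance and noise level are all assumptions chosen for the sketch; the paper applies ABC to a full brain physiology model, not this.

```python
# ABC rejection sketch: keep prior draws whose simulated output lies close
# to the observed data, yielding samples from an approximate posterior.
import math
import random
random.seed(0)

true_k = 2.0
def simulate(k):
    """Toy 'model output': noisy exponential decay observed at 5 time points."""
    return [math.exp(-k * t) + random.gauss(0, 0.02)
            for t in (0.1, 0.2, 0.3, 0.4, 0.5)]

observed = simulate(true_k)

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

accepted = []
for _ in range(20000):
    k = random.uniform(0.0, 5.0)            # flat prior over the decay rate
    if distance(simulate(k), observed) < 0.08:
        accepted.append(k)                  # keep parameters that fit the data

post_mean = sum(accepted) / len(accepted)
print(len(accepted), round(post_mean, 2))
```

The accepted set approximates a posterior distribution over the rate, which is exactly the kind of object the abstract argues guards against over-interpreting a single maximum likelihood point estimate.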
- Published
- 2019
- Full Text
- View/download PDF
22. A kinetic model for Brain-Derived Neurotrophic Factor mediated spike timing-dependent LTP.
- Author
-
Solinas, Sergio M. G., Edelmann, Elke, Leßmann, Volkmar, and Migliore, Michele
- Subjects
NEUROTROPHINS ,MAMMALS ,NERVOUS system ,NEURONS ,NEUROLOGY - Abstract
Across the mammalian nervous system, neurotrophins control synaptic plasticity, neuromodulation, and neuronal growth. The neurotrophin Brain Derived Neurotrophic Factor (BDNF) is known to promote structural and functional synaptic plasticity in the hippocampus, the cerebral cortex, and many other brain areas. In recent years, a wealth of data has been accumulated revealing the paramount importance of BDNF for neuronal function. BDNF signaling gives rise to multiple complex signaling pathways that mediate neuronal survival and differentiation during development, and formation of new memories. These different roles of BDNF for neuronal function have essential consequences if BDNF signaling in the brain is reduced. Thus, BDNF knock-out mice or mice that are deficient in BDNF receptor signaling via TrkB and p75 receptors show deficits in neuronal development, synaptic plasticity, and memory formation. Accordingly, BDNF signaling dysfunctions are associated with many neurological and neurodegenerative conditions including Alzheimer's and Huntington's disease. However, despite the widespread implications of BDNF-dependent signaling in synaptic plasticity in healthy and pathological conditions, the interplay of the involved different biochemical pathways at the synaptic level remained mostly unknown. In this paper, we investigated the role of BDNF/TrkB signaling in spike-timing dependent plasticity (STDP) in rodent hippocampus CA1 pyramidal cells, by implementing the first subcellular model of BDNF regulated, spike timing-dependent long-term potentiation (t-LTP). The model is based on previously published experimental findings on STDP and accounts for the observed magnitude, time course, stimulation pattern and BDNF-dependence of t-LTP. It allows interpreting the main experimental findings concerning specific biomolecular processes, and it can be expanded to take into account more detailed biochemical reactions. 
The results point out a few predictions on how to enhance LTP induction in such a way as to rescue or improve cognitive functions under pathological conditions. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
23. Model diagnostics and refinement for phylodynamic models.
- Author
-
Lau, Max SY, Grenfell, Bryan T, Worby, Colin J, and Gibson, Gavin J
- Subjects
EPIDEMIOLOGY ,GENOMICS ,PATHOGENIC microorganisms ,BIOLOGICAL evolution ,LIFE sciences ,SUPERSPREADING events - Abstract
Phylodynamic modelling, which studies the joint dynamics of epidemiological and evolutionary processes, has made significant progress in recent years due to increasingly available genomic data and advances in statistical modelling. These advances have greatly improved our understanding of transmission dynamics of many important pathogens. Nevertheless, there remains a lack of effective, targeted diagnostic tools for systematically detecting model mis-specification. Development of such tools is essential for model criticism, refinement, and calibration. The idea of utilising latent residuals for model assessment has already been exploited in general spatio-temporal epidemiological settings. Specifically, by proposing appropriately designed non-centered re-parameterizations of a given epidemiological process, one can construct latent residuals with known sampling distributions which can be used to quantify evidence of model mis-specification. In this paper, we extend this idea to formulate a novel model-diagnostic framework for phylodynamic models. Using simulated examples, we show that our framework may effectively detect a particular form of mis-specification in a phylodynamic model, particularly in the event of superspreading. We also exemplify our approach by applying the framework to a dataset describing a local foot-and-mouth disease (FMD) outbreak in the UK, eliciting strong evidence against the assumption of no within-host-diversity in the outbreak. We further demonstrate that our framework can facilitate model calibration in real-life scenarios, by proposing a within-host-diversity model which appears to offer a better fit to data than one that assumes no within-host-diversity of FMD virus. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
24. LMTRDA: Using logistic model tree to predict MiRNA-disease associations by fusing multi-source information of sequences and similarities.
- Author
-
Wang, Lei, You, Zhu-Hong, Chen, Xing, Li, Yang-Ming, Dong, Ya-Nan, Li, Li-Ping, and Zheng, Kai
- Subjects
LOGISTIC model (Demography) ,MICRORNA ,MEDICAL genetics ,RNA sequencing ,PREDICTION models ,BREAST tumors ,NATURAL language processing ,LYMPHOMA diagnosis - Abstract
Emerging evidence has shown microRNAs (miRNAs) play an important role in human disease research. Identifying potential associations between them is significant for the development of pathology, diagnosis and therapy. However, only a tiny portion of all miRNA-disease pairs in the current datasets are experimentally validated. This prompts the development of high-precision computational methods to predict real interaction pairs. In this paper, we propose a new model of Logistic Model Tree for predicting miRNA-Disease Association (LMTRDA) by fusing multi-source information including miRNA sequences, miRNA functional similarity, disease semantic similarity, and known miRNA-disease associations. In particular, we introduce miRNA sequence information and extract its features using a natural language processing technique for the first time in the miRNA-disease prediction model. In the cross-validation experiment, LMTRDA obtained 90.51% prediction accuracy with 92.55% sensitivity at the AUC of 90.54% on the HMDD V3.0 dataset. To further evaluate the performance of LMTRDA, we compared it with different classifier and feature descriptor models. In addition, we also validate the predictive ability of LMTRDA in human diseases including Breast Neoplasms, Breast Neoplasms and Lymphoma. As a result, 28, 27 and 26 out of the top 30 miRNAs associated with these diseases were verified by experiments in different kinds of case studies. These experimental results demonstrate that LMTRDA is a reliable model for predicting associations between miRNAs and diseases. [ABSTRACT FROM AUTHOR]
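Treating a miRNA sequence as "text" can be illustrated with the simplest NLP-style representation: k-mer words mapped to a normalised bag-of-words vector. The paper uses a learned embedding rather than raw counts, so take this as an assumed stand-in; the sequence below is a let-7a-like example, not data from the paper.

```python
# k-mer bag-of-words sketch: map an RNA sequence to a frequency vector
# over all 4^k possible k-mers, a simple NLP-style feature representation.
from itertools import product

def kmer_features(seq, k=3):
    """Return the normalised frequency of every k-mer over alphabet ACGU."""
    vocab = [''.join(p) for p in product('ACGU', repeat=k)]
    counts = {w: 0 for w in vocab}
    for i in range(len(seq) - k + 1):
        counts[seq[i:i + k]] += 1
    total = max(1, len(seq) - k + 1)
    return [counts[w] / total for w in vocab]

v = kmer_features('UGAGGUAGUAGGUUGUAUAGUU')   # let-7a-like example sequence
print(len(v), round(sum(v), 2))
```

The resulting fixed-length vector can be concatenated with similarity-based features and fed to any classifier, which is the general fusion pattern the abstract describes.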
- Published
- 2019
- Full Text
- View/download PDF
25. A numerical approach for a discrete Markov model for progressing drug resistance of cancer.
- Author
-
Maeda, Masayuki and Yamashita, Hideaki
- Subjects
MARKOV processes ,DRUG resistance ,CANCER treatment ,COMPUTER simulation ,PROBABILITY theory - Abstract
The presence of treatment-resistant cells is an important factor that limits the efficacy of cancer therapy, and the prospect of resistance is a major consideration in designing treatment strategies. Several recent studies have employed mathematical models to elucidate the dynamics of generating resistant cancer cells and attempted to predict the probability of emerging resistant cells. The purpose of this paper is to present a numerical approach to compute the number of resistant cells and the emerging probability of resistance. A stochastic model was designed, and a method was developed to approximately but efficiently compute the number of resistant cells and the probability of resistance. To model the progression of cancer, a discrete-state, two-dimensional Markov process whose states are the total number of cells and the number of resistant cells was employed. Exact analysis and approximate aggregation approaches were then proposed to calculate the number of resistant cells and the probability of resistance when the cell population reaches detection size. To confirm the accuracy of the approximation, relative errors between exact analysis and approximation were computed. The numerical values of our approximation method were very close to those of exact analysis calculated in the range of small detection sizes M = 500, 1000, and 1500. Computer simulation was then performed to confirm the accuracy of the approximation when the detection size was M = 10000, 30000, 50000, 100000 and 1000000. All the numerical results of the approximation fell between the upper and lower levels of the 95% confidence intervals, and our method took less time to compute over a broad range of cell sizes. The effects of parameter changes on the emerging probabilities of resistance were also investigated using values computed by the approximation method. 
The results showed that the number of divisions until the cell population reaches the detection size is an important determinant of the probability that resistance emerges. The next step of the numerical approach is to compute the emerging probabilities of resistance under drug administration and with multiple mutations. Another effective approximation would be necessary for the analysis of the latter case. [ABSTRACT FROM AUTHOR]
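The quantity at the heart of this abstract, the probability that resistant cells are present when the population first reaches detection size, can be estimated with a crude Monte-Carlo caricature: a two-type growth process in which each sensitive-cell division mutates with small probability. The mutation rate, detection size and absence of cell death are assumptions for illustration; this is not the authors' two-dimensional Markov chain or their aggregation method.

```python
# Monte-Carlo sketch: probability that resistance has emerged by the time
# a population grown from one sensitive cell reaches detection size M.
import random
random.seed(1)

def resistant_at_detection(M, u=1e-4):
    """Grow from 1 sensitive cell to M cells; each division mutates w.p. u."""
    sensitive, resistant = 1, 0
    while sensitive + resistant < M:
        # pick the dividing lineage in proportion to current numbers
        if random.random() < resistant / (sensitive + resistant):
            resistant += 1                       # resistant daughter cell
        elif random.random() < u:
            resistant += 1                       # mutation at division
        else:
            sensitive += 1
    return resistant > 0

trials = 2000
p_emerge = sum(resistant_at_detection(1000) for _ in range(trials)) / trials
print(round(p_emerge, 3))
```

With roughly M divisions needed to reach size M, the emergence probability is on the order of 1 - (1 - u)^M, which matches the abstract's observation that the number of divisions before detection largely determines whether resistance emerges.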
- Published
- 2019
- Full Text
- View/download PDF
26. A data-driven interactome of synergistic genes improves network-based cancer outcome prediction.
- Author
-
Allahyar, Amin, Ubels, Joske, and de Ridder, Jeroen
- Subjects
CANCER patients ,GENE expression ,CANCER treatment ,HEALTH outcome assessment ,MOLECULAR genetics - Abstract
Robustly predicting outcome for cancer patients from gene expression is an important challenge on the road to better personalized treatment. Network-based outcome predictors (NOPs), which consider the cellular wiring diagram in the classification, hold much promise to improve performance, stability and interpretability of identified marker genes. Problematically, reports on the efficacy of NOPs are conflicting and for instance suggest that utilizing random networks performs on par with networks that describe biologically relevant interactions. In this paper we turn the prediction problem around: instead of using a given biological network in the NOP, we aim to identify the network of genes that truly improves outcome prediction. To this end, we propose SyNet, a gene network constructed ab initio from synergistic gene pairs derived from survival-labelled gene expression data. To obtain SyNet, we evaluate synergy for all 69 million pairwise combinations of genes, resulting in a network that is specific to the dataset and phenotype under study and can be used in a NOP model. We evaluated SyNet and 11 other networks on a compendium dataset of >4000 survival-labelled breast cancer samples. For this purpose, we used cross-study validation, which more closely emulates real world application of these outcome predictors. We find that SyNet is the only network that truly improves performance, stability and interpretability in several existing NOPs. We show that SyNet overlaps significantly with existing gene networks, and can be confidently predicted (~85% AUC) from graph-topological descriptions of these networks, in particular the breast tissue-specific network. Due to its data-driven nature, SyNet is not biased to well-studied genes and thus facilitates post-hoc interpretation. We find that SyNet is highly enriched for known breast cancer genes and genes related to e.g. histological grade and tamoxifen resistance, suggestive of a role in determining breast cancer outcome. 
[ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
27. On variational solutions for whole brain serial-section histology using a Sobolev prior in the computational anatomy random orbit model.
- Author
-
Lee, Brian C., Tward, Daniel J., Mitra, Partha P., and Miller, Michael I.
- Subjects
HISTOLOGICAL techniques ,DIFFEOMORPHISMS ,HISTOLOGY ,BRAIN ,MICE - Abstract
This paper presents a variational framework for dense diffeomorphic atlas-mapping onto high-throughput histology stacks at the 20 μm meso-scale. The observed sections are modelled as Gaussian random fields conditioned on a sequence of unknown section by section rigid motions and unknown diffeomorphic transformation of a three-dimensional atlas. To regularize over the high-dimensionality of our parameter space (which is a product space of the rigid motion dimensions and the diffeomorphism dimensions), the histology stacks are modelled as arising from a first order Sobolev space smoothness prior. We show that the joint maximum a-posteriori, penalized-likelihood estimator of our high dimensional parameter space emerges as a joint optimization interleaving rigid motion estimation for histology restacking and large deformation diffeomorphic metric mapping to atlas coordinates. We show that joint optimization in this parameter space solves the classical curvature non-identifiability of the histology stacking problem. The algorithms are demonstrated on a collection of whole-brain histological image stacks from the Mouse Brain Architecture Project. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
28. Bayesian adaptive dual control of deep brain stimulation in a computational model of Parkinson’s disease.
- Author
-
Grado, Logan L., Johnson, Matthew D., and Netoff, Theoden I.
- Subjects
BAYESIAN analysis ,PROBABILITY theory ,BRAIN stimulation ,KINDLING (Neurology) ,TRANSCRANIAL magnetic stimulation - Abstract
In this paper, we present a novel Bayesian adaptive dual controller (ADC) for autonomously programming deep brain stimulation devices. We evaluated the Bayesian ADC’s performance in the context of reducing beta power in a computational model of Parkinson’s disease, in which it was tasked with finding the set of stimulation parameters which optimally reduced beta power as fast as possible. Here, the Bayesian ADC has dual goals: (a) to minimize beta power by exploiting the best parameters found so far, and (b) to explore the space to find better parameters, thus allowing for better control in the future. The Bayesian ADC is composed of two parts: an inner parameterized feedback stimulator and an outer parameter adjustment loop. The inner loop operates on a short time scale, delivering stimulus based upon the phase and power of the beta oscillation. The outer loop operates on a long time scale, observing the effects of the stimulation parameters and using Bayesian optimization to intelligently select new parameters to minimize the beta power. We show that the Bayesian ADC can efficiently optimize stimulation parameters, and is superior to other optimization algorithms. The Bayesian ADC provides a robust and general framework for tuning stimulation parameters, can be adapted to use any feedback signal, and is applicable across diseases and stimulator designs. [ABSTRACT FROM AUTHOR]
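The outer loop's Bayesian optimization can be sketched in one dimension: fit a Gaussian-process surrogate to noisy objective evaluations and pick the next "stimulation parameter" by an exploration-aware acquisition rule. Everything here, the toy beta-power objective, the RBF kernel hyperparameters, and the lower-confidence-bound acquisition, is an assumption for illustration, not the authors' controller.

```python
# Bayesian optimization sketch: GP surrogate + lower confidence bound (LCB).
import numpy as np
rng = np.random.default_rng(0)

def beta_power(x):
    """Pretend beta power: minimal near x = 0.7, observed with noise."""
    return (x - 0.7) ** 2 + 0.01 * rng.standard_normal()

def gp_posterior(X, y, Xs, ell=0.2, sf=1.0, sn=0.05):
    """GP posterior mean/sd on grid Xs given observations (X, y), RBF kernel."""
    k = lambda a, b: sf * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)
    K = k(X, X) + sn**2 * np.eye(len(X))
    Ks = k(X, Xs)
    sol = np.linalg.solve(K, Ks)
    mu = sol.T @ y
    var = sf - np.einsum('ij,ij->j', Ks, sol)
    return mu, np.sqrt(np.maximum(var, 1e-9))

X = np.array([0.0, 0.5, 1.0])          # initial probe parameters
y = np.array([beta_power(x) for x in X])
grid = np.linspace(0, 1, 201)

for _ in range(15):                    # outer parameter-adjustment loop
    mu, sd = gp_posterior(X, y, grid)
    acq = mu - 1.5 * sd                # LCB: trade off exploitation/exploration
    x_new = grid[np.argmin(acq)]
    X = np.append(X, x_new)
    y = np.append(y, beta_power(x_new))

best = float(X[np.argmin(y)])
print(round(best, 2))
```

The exploration term (the sd bonus) is what gives the controller its "dual" character: it sometimes probes uncertain parameter regions instead of always exploiting the current best.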
- Published
- 2018
- Full Text
- View/download PDF
29. Predicting B cell receptor substitution profiles using public repertoire data.
- Author
-
Dhar, Amrit, Davidsen, Kristian, Matsen IV, Frederick A., and Minin, Vladimir N.
- Subjects
B cell receptors ,AMINO acids ,GENETIC mutation ,CLONING ,GERMINAL centers ,IMMUNOTECHNOLOGY - Abstract
B cells develop high affinity receptors during the course of affinity maturation, a cyclic process of mutation and selection. At the end of affinity maturation, a number of cells sharing the same ancestor (i.e. in the same “clonal family”) are released from the germinal center; their amino acid frequency profile reflects the allowed and disallowed substitutions at each position. These clonal-family-specific frequency profiles, called “substitution profiles”, are useful for studying the course of affinity maturation as well as for antibody engineering purposes. However, most often only a single sequence is recovered from each clonal family in a sequencing experiment, making it impossible to construct a clonal-family-specific substitution profile. Given the public release of many high-quality large B cell receptor datasets, one may ask whether it is possible to use such data in a prediction model for clonal-family-specific substitution profiles. In this paper, we present the method “Substitution Profiles Using Related Families” (SPURF), a penalized tensor regression framework that integrates information from a rich assemblage of datasets to predict the clonal-family-specific substitution profile for any single input sequence. Using this framework, we show that substitution profiles from similar clonal families can be leveraged together with simulated substitution profiles and germline gene sequence information to improve prediction. We fit this model on a large public dataset and validate the robustness of our approach on two external datasets. Furthermore, we provide a command-line tool in an open-source software package () implementing these ideas and providing easy prediction using our pre-fit models. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
30. A minimally invasive neurostimulation method for controlling abnormal synchronisation in the neuronal activity.
- Author
-
Asllani, Malbor, Expert, Paul, and Carletti, Timoteo
- Subjects
NEURAL stimulation ,SYNCHRONIC order ,NEURAL physiology ,PARKINSON'S disease ,CONTROL theory (Engineering) - Abstract
Many collective phenomena in Nature emerge from the (partial) synchronisation of the units comprising a system. In the case of the brain, this self-organised process allows groups of neurons to fire in highly intricate partially synchronised patterns and eventually leads to high level cognitive outputs and control over the human body. However, when the synchronisation patterns are altered and hypersynchronisation occurs, undesirable effects can occur. This is particularly striking and well documented in the case of epileptic seizures and tremors in neurodegenerative diseases such as Parkinson’s disease. In this paper, we propose an innovative, minimally invasive, control method that can effectively desynchronise misfiring brain regions and thus mitigate and even eliminate the symptoms of the diseases. The control strategy, grounded in Hamiltonian control theory, is applied to ensembles of neurons modelled via the Kuramoto or the Stuart-Landau models and allows for heterogeneous coupling among the interacting units. The theory has been complemented with dedicated numerical simulations performed using the small-world Newman-Watts network and the random Erdős-Rényi network. Finally, the method has been compared with the gold-standard Proportional-Differential Feedback control technique. Our method is shown to achieve equivalent levels of desynchronisation using less control strength and/or fewer controllers, being thus minimally invasive. [ABSTRACT FROM AUTHOR]
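To make the control target concrete, here is a minimal mean-field Kuramoto simulation showing the synchronisation order parameter growing from incoherence (no control term is included; ensemble size, coupling and frequency spread are assumptions). The hypersynchronised state this produces is exactly what the paper's controller is designed to break up.

```python
# Kuramoto ensemble sketch: Euler integration and the order parameter |r|,
# where |r| ~ 1 means full synchrony and |r| ~ 0 means incoherence.
import numpy as np
rng = np.random.default_rng(2)

N, K, dt = 200, 2.0, 0.01
omega = rng.normal(0.0, 0.5, N)        # heterogeneous natural frequencies
theta = rng.uniform(0, 2 * np.pi, N)   # random initial phases

def order_parameter(theta):
    return abs(np.exp(1j * theta).mean())

r0 = order_parameter(theta)
for _ in range(3000):                  # Euler steps of the mean-field dynamics
    r = np.exp(1j * theta).mean()
    coupling = K * abs(r) * np.sin(np.angle(r) - theta)
    theta += dt * (omega + coupling)

r_final = order_parameter(theta)
print(round(r0, 2), round(r_final, 2))
```

Because the coupling K is above the synchronisation threshold, |r| rises sharply; a desynchronising controller would aim to drive it back down with as little injected signal as possible.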
- Published
- 2018
- Full Text
- View/download PDF
31. Age density patterns in patients' medical conditions: A clustering approach.
- Author
-
Alhasoun, Fahad, Aleissa, Faisal, Alhazzani, May, Moyano, Luis G., Pinhanez, Claudio, and González, Marta C.
- Subjects
HEALTH facilities ,POPULATION biology ,POPULATION density ,MEDICAL care ,COMMUNICABLE diseases ,CHICKENPOX - Abstract
This paper presents a data analysis framework to uncover relationships between health conditions, age and sex for a large population of patients. We study a massive heterogeneous sample of 1.7 million patients in Brazil, containing 47 million health records with detailed medical conditions for visits to medical facilities over a period of 17 months. The findings suggest that medical conditions can be grouped into clusters that share very distinctive densities in the ages of the patients. For each cluster, we further present the ICD-10 chapters within it. Finally, we relate the findings to comorbidity networks, uncovering the relation of the discovered clusters of age densities to the comorbidity networks literature. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
32. The role of intracellular signaling in the stripe formation in engineered Escherichia coli populations.
- Author
-
Xue, Xiaoru, Xue, Chuan, and Tang, Min
- Subjects
ESCHERICHIA coli enzymes ,ESCHERICHIA coli proteins ,ESCHERICHIA coli physiology ,COMPUTATIONAL biology ,CELL division - Abstract
Recent experiments showed that engineered Escherichia coli colonies grow and self-organize into periodic stripes with high and low cell densities in semi-solid agar. The stripes develop sequentially behind a radially propagating colony front, similar to the formation of many other periodic patterns in nature. These bacteria were created by genetically coupling the intracellular chemotaxis pathway of wild-type cells with a quorum sensing module through the protein CheZ. In this paper, we develop multiscale models to investigate how this intracellular pathway affects stripe formation. We first develop a detailed hybrid model that treats each cell as an individual particle and incorporates intracellular signaling via an internal ODE system. To overcome the computational cost of the hybrid model caused by the large number of cells involved, we next derive a mean-field PDE model from the hybrid model using asymptotic analysis. We show that this analysis is justified by the tight agreement between the PDE model and the hybrid model in 1D simulations. Numerical simulations of the PDE model in 2D with radial symmetry agree with experimental data semi-quantitatively. Finally, we use the PDE model to make a number of testable predictions on how the stripe patterns depend on cell-level parameters, including cell speed, cell doubling time and the turnover rate of intracellular CheZ. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
33. Minimal model of interictal and ictal discharges “Epileptor-2”.
- Author
-
Chizhov, Anton V., Zefirov, Artyom V., Amakhin, Dmitry V., Smirnova, Elena Yu., and Zaitsev, Aleksey V.
- Subjects
INTERNEURONS ,MEMBRANE potential ,ACTION potentials ,POTASSIUM ,NEUROSCIENCES - Abstract
Seizures occur in a recurrent manner with intermittent states of interictal and ictal discharges (IIDs and IDs). The transitions to and from IDs are determined by a set of processes, including synaptic interaction and ionic dynamics. Although mathematical models of separate types of epileptic discharges have been developed, modeling the transitions between states remains a challenge. A simple generic mathematical model of seizure dynamics (Epileptor) has recently been proposed by Jirsa et al. (2014); however, it is formulated in terms of abstract variables. In this paper, a minimal population-type model of IIDs and IDs is proposed that is as simple to use as the Epileptor, but the suggested model attributes physical meaning to the variables. The model is expressed in ordinary differential equations for extracellular potassium and intracellular sodium concentrations, membrane potential, and short-term synaptic depression variables. A quadratic integrate-and-fire model driven by the population input current is used to reproduce spike trains in a representative neuron. In simulations, potassium accumulation governs the transition from the silent state to the state of an ID. Each ID is composed of clustered IID-like events. The sodium accumulates during discharge and activates the sodium-potassium pump, which terminates the ID by restoring the potassium gradient and thus polarizing the neuronal membranes. The whole-cell and cell-attached recordings of a 4-AP-based in vitro model of epilepsy confirmed the primary model assumptions and predictions. The mathematical analysis revealed that the IID-like events are large-amplitude stochastic oscillations, which in the case of ID generation are controlled by slow oscillations of ionic concentrations. The IDs originate in the conditions of elevated potassium concentrations in a bath solution via a saddle-node-on-invariant-circle-like bifurcation for a non-smooth dynamical system. 
By providing a minimal biophysical description of ionic dynamics and network interactions, the model may serve as a hierarchical base for moving from simple to more complex modeling of seizures. [ABSTRACT FROM AUTHOR]
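The representative-neuron component mentioned in this abstract, a quadratic integrate-and-fire (QIF) model, is simple enough to sketch directly: the voltage obeys dv/dt = v² + I with a reset rule. The reset/peak values, drive levels and time step below are assumptions; the paper couples this neuron to population input rather than constant current.

```python
# Quadratic integrate-and-fire sketch: dv/dt = v^2 + I with reset at a peak.
dt, v_reset, v_peak = 0.01, -1.0, 10.0   # ms, dimensionless voltages (assumed)

def qif_spike_count(I, t_max=200.0):
    """Count spikes under constant drive I over t_max ms (forward Euler)."""
    v, spikes = v_reset, 0
    for _ in range(int(t_max / dt)):
        v += dt * (v * v + I)
        if v >= v_peak:                  # spike detected: reset the membrane
            v = v_reset
            spikes += 1
    return spikes

print(qif_spike_count(-0.5), qif_spike_count(0.5), qif_spike_count(2.0))
```

For I < 0 the neuron settles at a stable fixed point and stays silent; for I > 0 it fires repetitively, faster as I grows, which is how population drive is converted into the spike trains shown during interictal and ictal discharges.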
- Published
- 2018
- Full Text
- View/download PDF
34. Simulations to benchmark time-varying connectivity methods for fMRI.
- Author
-
Thompson, William Hedley, Richter, Craig Geoffrey, Plavén-Sigray, Pontus, and Fransson, Peter
- Subjects
FUNCTIONAL magnetic resonance imaging ,BRAIN imaging ,SIMULATION methods & models ,MULTIPLICATION ,ANALYSIS of covariance - Abstract
There is a current interest in quantifying time-varying connectivity (TVC) based on neuroimaging data such as fMRI. Many methods have been proposed, and are being applied, revealing new insight into the brain’s dynamics. However, given that the ground truth for TVC in the brain is unknown, many concerns remain regarding the accuracy of proposed estimates. Since there exist many TVC methods it is difficult to assess differences in time-varying connectivity between studies. In this paper, we present tvc_benchmarker, which is a Python package containing four simulations to test TVC methods. Here, we evaluate five different methods that together represent a wide spectrum of current approaches to estimating TVC (sliding window, tapered sliding window, multiplication of temporal derivatives, spatial distance and jackknife correlation). These simulations were designed to test each method’s ability to track changes in covariance over time, which is a key property in TVC analysis. We found that all tested methods correlated positively with each other, but there were large differences in the strength of the correlations between methods. To facilitate comparisons with future TVC methods, we propose that the described simulations can act as benchmark tests for evaluation of methods. Using tvc_benchmarker researchers can easily add, compare and submit their own TVC methods to evaluate their performance. [ABSTRACT FROM AUTHOR]
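The simplest method in the comparison, sliding-window correlation, can be sketched on synthetic data whose true covariance switches sign halfway through, the kind of change these benchmarks test for. The series lengths, window size and noise level are assumptions; this is not code from tvc_benchmarker.

```python
# Sliding-window correlation sketch on two series whose coupling flips sign.
import numpy as np
rng = np.random.default_rng(3)

T, w = 600, 60                    # time points and window length (assumed)
x = rng.standard_normal(T)
y = (np.where(np.arange(T) < T // 2, 1.0, -1.0) * 0.8 * x
     + 0.6 * rng.standard_normal(T))   # coupled, then anti-coupled

def sliding_corr(x, y, w):
    return np.array([np.corrcoef(x[t:t + w], y[t:t + w])[0, 1]
                     for t in range(len(x) - w)])

r = sliding_corr(x, y, w)
early, late = float(r[:100].mean()), float(r[-100:].mean())
print(round(early, 2), round(late, 2))
```

The estimated connectivity is strongly positive early and strongly negative late, with windows spanning the switch smearing the transition, illustrating the resolution/stability trade-off that motivates comparing several TVC estimators.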
- Published
- 2018
- Full Text
- View/download PDF
35. Potassium and sodium microdomains in thin astroglial processes: A computational model study.
- Author
-
Breslin, Kevin, Wade, John Joseph, Harkin, Jim, Flanagan, Bronac, McDaid, Liam, Wong-Lin, KongFatt, Van Zalinge, Harm, Hall, Steve, Walker, Matthew, and Verkhratsky, Alexei
- Subjects
BIOLOGICAL mathematical modeling ,HOMEOSTASIS ,EXTRACELLULAR space ,ASTROCYTES ,CENTRAL nervous system ,NEURAL transmission ,GABA - Abstract
A biophysical model that captures molecular homeostatic control of ions at the perisynaptic cradle (PsC) is of fundamental importance for understanding the interplay between astroglial and neuronal compartments. In this paper, we develop a multi-compartmental mathematical model which proposes a novel mechanism whereby the flow of cations in thin processes is restricted due to negatively charged membrane lipids which result in the formation of deep potential wells near the dipole heads. These wells restrict the flow of cations to “hopping” between adjacent wells as they traverse the process, and this surface retention of cations will be shown to give rise to the formation of potassium (K+) and sodium (Na+) microdomains at the PsC. We further propose that a K+ microdomain formed at the PsC provides the driving force for the return of K+ to the extracellular space for uptake by the neurone, thereby preventing K+ undershoot. A slow decay of Na+ was also observed in our simulation after a period of glutamate stimulation, which is in strong agreement with experimental observations. The pathological implications of microdomain formation during neuronal excitation are also discussed. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
36. Predictive modelling of a novel anti-adhesion therapy to combat bacterial colonisation of burn wounds.
- Author
-
Roberts, Paul A., Huebinger, Ryan M., Keen, Emma, Krachler, Anne-Marie, and Jabbari, Sara
- Subjects
TREATMENT for burns & scalds ,ANTIBIOTICS ,DRUG resistance in bacteria ,COLONIZATION (Ecology) ,DRUG development - Abstract
As the development of new classes of antibiotics slows, bacterial resistance to existing antibiotics is becoming an increasing problem. A potential solution is to develop treatment strategies with an alternative mode of action. We consider one such strategy: anti-adhesion therapy. Whereas antibiotics act directly upon bacteria, either killing them or inhibiting their growth, anti-adhesion therapy impedes the binding of bacteria to host cells. This prevents bacteria from deploying their arsenal of virulence mechanisms, while simultaneously rendering them more susceptible to natural and artificial clearance. In this paper, we consider a particular form of anti-adhesion therapy, involving biomimetic multivalent adhesion molecule 7 coupled polystyrene microbeads, which competitively inhibit the binding of bacteria to host cells. We develop a mathematical model, formulated as a system of ordinary differential equations, to describe inhibitor treatment of a Pseudomonas aeruginosa burn wound infection in the rat. Benchmarking our model against in vivo data from an ongoing experimental programme, we use the model to explain bacteria population dynamics and to predict the efficacy of a range of treatment strategies, with the aim of improving treatment outcome. The model consists of two physical compartments: the host cells and the exudate. It is found that, when effective in reducing the bacterial burden, inhibitor treatment operates both by preventing bacteria from binding to the host cells and by reducing the flux of daughter cells from the host cells into the exudate. Our model predicts that inhibitor treatment cannot eliminate the bacterial burden when used in isolation; however, when combined with regular or continuous debridement of the exudate, elimination is theoretically possible. Lastly, we present ways to improve therapeutic efficacy, as predicted by our mathematical model. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
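The compartment structure described in entry 36 lends itself to a compact ODE sketch. The state variables (free bacteria in the exudate, free host binding sites, attached bacteria) follow the abstract's verbal description, but the rate constants and the assumption of a constant inhibitor excess are invented for illustration; this is not the paper's fitted model:

```python
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, s, r, k_bind, k_inh, k_clear):
    """Toy anti-adhesion model: attached bacteria shed daughters into the
    exudate, free bacteria bind host sites, inhibitor beads block sites."""
    B, H, A = s                      # free bacteria, free sites, attached bacteria
    bind = k_bind * B * H
    dB = r * A - bind - k_clear * B  # shedding in; binding and clearance out
    dH = -bind - k_inh * H           # sites lost to bacteria and to inhibitor
    dA = bind
    return [dB, dH, dA]

s0 = [1.0, 10.0, 0.0]
no_inh = solve_ivp(rhs, (0.0, 10.0), s0, args=(0.5, 0.1, 0.0, 0.3))
with_inh = solve_ivp(rhs, (0.0, 10.0), s0, args=(0.5, 0.1, 2.0, 0.3))
burden = lambda sol: sol.y[0, -1] + sol.y[2, -1]   # free + attached at t = 10
print(burden(no_inh), burden(with_inh))
```

In keeping with the abstract's conclusion, the inhibitor lowers but does not eliminate the burden in this sketch; modelling debridement of the exudate would add a further removal term on the free-bacteria compartment.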
37. Correcting for batch effects in case-control microbiome studies.
- Author
-
Gibbons, Sean M., Duvallet, Claire, and Alm, Eric J.
- Subjects
CASE-control method ,MICROARRAY technology ,RNA ,MICROBIAL genomics - Abstract
High-throughput data generation platforms, like mass-spectrometry, microarrays, and second-generation sequencing are susceptible to batch effects due to run-to-run variation in reagents, equipment, protocols, or personnel. Currently, batch correction methods are not commonly applied to microbiome sequencing datasets. In this paper, we compare different batch-correction methods applied to microbiome case-control studies. We introduce a model-free normalization procedure where features (i.e. bacterial taxa) in case samples are converted to percentiles of the equivalent features in control samples within a study prior to pooling data across studies. We look at how this percentile-normalization method compares to traditional meta-analysis methods for combining independent p-values and to limma and ComBat, widely used batch-correction models developed for RNA microarray data. Overall, we show that percentile-normalization is a simple, non-parametric approach for correcting batch effects and improving sensitivity in case-control meta-analyses. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
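The percentile-normalization procedure in entry 37 is simple enough to state directly: each case value is replaced by its rank within the same study's control distribution for that feature, before pooling across studies. A minimal sketch (not the authors' implementation):

```python
import numpy as np

def percentile_normalize(cases, controls):
    """Replace each case value with the fraction of control samples
    at or below it, feature by feature (model-free, per study)."""
    cases = np.asarray(cases, dtype=float)
    controls = np.asarray(controls, dtype=float)
    out = np.empty_like(cases)
    for j in range(cases.shape[1]):
        ctrl = np.sort(controls[:, j])
        out[:, j] = np.searchsorted(ctrl, cases[:, j], side="right") / len(ctrl)
    return out

controls = np.array([[0.0], [1.0], [2.0], [3.0]])
cases = np.array([[2.5], [0.5]])
print(percentile_normalize(cases, controls))   # → [[0.75], [0.25]]
```

Because every study is normalized against its own controls before pooling, run-to-run batch shifts that affect a study's cases and controls equally cancel out.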
38. A multitask clustering approach for single-cell RNA-seq analysis in Recessive Dystrophic Epidermolysis Bullosa.
- Author
-
Zhang, Huanan, Lee, Catherine A. A., Li, Zhuliu, Garbe, John R., Eide, Cindy R., Petegrosso, Raphael, Kuang, Rui, and Tolar, Jakub
- Subjects
EPIDERMOLYSIS bullosa ,RNA sequencing ,FLOW cytometry ,BIOMARKERS ,GENE expression - Abstract
Single-cell RNA sequencing (scRNA-seq) has been widely applied to discover new cell types by detecting sub-populations in a heterogeneous group of cells. Since scRNA-seq experiments have lower read coverage/tag counts and introduce more technical biases compared to bulk RNA-seq experiments, the limited number of sampled cells combined with the experimental biases and other dataset specific variations presents a challenge to cross-dataset analysis and discovery of relevant biological variations across multiple cell populations. In this paper, we introduce a method of variance-driven multitask clustering of single-cell RNA-seq data (scVDMC) that utilizes multiple single-cell populations from biological replicates or different samples. scVDMC clusters single cells in multiple scRNA-seq experiments of similar cell types and markers but varying expression patterns such that the scRNA-seq data are better integrated than typical pooled analyses which only increase the sample size. By controlling the variance among the cell clusters within each dataset and across all the datasets, scVDMC detects cell sub-populations in each individual experiment with shared cell-type markers but varying cluster centers among all the experiments. Applied to two real scRNA-seq datasets with several replicates and one large-scale Drop-seq dataset on three patient samples, scVDMC more accurately detected cell populations and known cell markers than pooled clustering and other recently proposed scRNA-seq clustering methods. In the case study applied to in-house Recessive Dystrophic Epidermolysis Bullosa (RDEB) scRNA-seq data, scVDMC revealed several new cell types and unknown markers validated by flow cytometry. MATLAB/Octave code available at . [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
39. In silico study of multicellular automaticity of heterogeneous cardiac cell monolayers: Effects of automaticity strength and structural linear anisotropy.
- Author
-
Duverger, James Elber, Jacquemet, Vincent, Vinet, Alain, and Comtois, Philippe
- Subjects
HEART cells ,AUTOMATICITY (Learning process) ,ANISOTROPY ,MYOCARDIUM ,PACEMAKER cells - Abstract
The biological pacemaker approach is an alternative to cardiac electronic pacemakers. Its main objective is to create pacemaking activity from added or modified distribution of spontaneous cells in the myocardium. This paper aims to assess how automaticity strength of pacemaker cells (i.e. their ability to maintain robust spontaneous activity with fast rate and to drive neighboring quiescent cells) and structural linear anisotropy, combined with density and spatial distribution of pacemaker cells, may affect the macroscopic behavior of the biological pacemaker. A stochastic algorithm was used to randomly distribute pacemaker cells, with various densities and spatial distributions, in a semi-continuous mathematical model. Simulations of the model showed that stronger automaticity allows onset of spontaneous activity for lower densities and more homogeneous spatial distributions, displayed more central foci, less variability in cycle lengths and synchronization of electrical activation for similar spatial patterns, but more variability in those same variables for dissimilar spatial patterns. Compared to their isotropic counterparts, in silico anisotropic monolayers had fewer central foci and displayed more variability in cycle lengths and synchronization of electrical activation for both similar and dissimilar spatial patterns. The present study established a link between microscopic structure and macroscopic behavior of the biological pacemaker, and may provide crucial information for optimized biological pacemaker therapies. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
40. Life cycle synchronization is a viral drug resistance mechanism.
- Author
-
Neagu, Iulia A., Olejarz, Jason, Freeman, Mark, Rosenbloom, Daniel I.S., Nowak, Martin A., and Hill, Alison L.
- Subjects
ANTIVIRAL agents ,VIRUS diseases ,LIFE cycles (Biology) ,DRUG resistance ,DRUG tolerance - Abstract
Viral infections are one of the major causes of death worldwide, with HIV infection alone resulting in over 1.2 million casualties per year. Antiviral drugs are now being administered for a variety of viral infections, including HIV, hepatitis B and C, and influenza. These therapies target a specific phase of the virus’s life cycle, yet their ultimate success depends on a variety of factors, such as adherence to a prescribed regimen and the emergence of viral drug resistance. The epidemiology and evolution of drug resistance have been extensively characterized, and it is generally assumed that drug resistance arises from mutations that alter the virus’s susceptibility to the direct action of the drug. In this paper, we consider the possibility that a virus population can evolve towards synchronizing its life cycle with the pattern of drug therapy. The periodicity of the drug treatment could then allow for a virus strain whose life cycle length is a multiple of the dosing interval to replicate only when the concentration of the drug is lowest. This process, referred to as “drug tolerance by synchronization”, could allow the virus population to maximize its overall fitness without having to alter drug binding or complete its life cycle in the drug’s presence. We use mathematical models and stochastic simulations to show that life cycle synchronization can indeed be a mechanism of viral drug tolerance. We show that this effect is more likely to occur when the variability in both viral life cycle and drug dose timing is low. More generally, we find that in the presence of periodic drug levels, time-averaged calculations of viral fitness do not accurately predict drug levels needed to eradicate infection, even if there is no synchronization. We derive an analytical expression for viral fitness that is sufficient to explain the drug-pattern-dependent survival of strains with any life cycle length. We discuss the implications of these findings for clinically relevant antiviral strategies. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
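The synchronization argument in entry 40 can be illustrated with a deterministic toy calculation (the paper itself uses mathematical models and stochastic simulations). The 24-hour dosing period, the 25% low-drug window, and the phase offset below are invented for the example:

```python
def survival_fraction(cycle_len, start, period=24.0, low_frac=0.25, n_cycles=100):
    """Fraction of life cycles that complete while drug concentration is low.
    Drug is taken to be 'low' during the final low_frac of each dosing period."""
    ok = 0
    for k in range(1, n_cycles + 1):
        t = start + k * cycle_len            # completion time of the k-th cycle
        phase = (t % period) / period
        if phase >= 1.0 - low_frac:
            ok += 1
    return ok / n_cycles

sync = survival_fraction(24.0, start=21.6)    # life cycle = dosing interval
unsync = survival_fraction(23.0, start=21.6)  # life cycle drifts across phases
print(sync, unsync)
```

A strain whose cycle length is a multiple of the dosing interval completes every cycle inside the low-drug window, while an incommensurate strain drifts through all drug phases and completes only about a quarter of its cycles there, matching the abstract's observation that synchronization pays off when timing variability is low.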
41. A phylogenetic method to perform genome-wide association studies in microbes that accounts for population structure and recombination.
- Author
-
Collins, Caitlin and Didelot, Xavier
- Subjects
PHYLOGENY ,MICROORGANISMS ,NEISSERIA meningitidis ,PENICILLIN ,DRUG resistance in bacteria - Abstract
Genome-Wide Association Studies (GWAS) in microbial organisms have the potential to vastly improve the way we understand, manage, and treat infectious diseases. Yet, microbial GWAS methods established thus far remain insufficiently able to capitalise on the growing wealth of bacterial and viral genetic sequence data. Facing clonal population structure and homologous recombination, existing GWAS methods struggle to achieve both the precision necessary to reject spurious findings and the power required to detect associations in microbes. In this paper, we introduce a novel phylogenetic approach that has been tailor-made for microbial GWAS, which is applicable to organisms ranging from purely clonal to frequently recombining, and to both binary and continuous phenotypes. Our approach is robust to the confounding effects of both population structure and recombination, while maintaining high statistical power to detect associations. Thorough testing via application to simulated data provides strong support for the power and specificity of our approach and demonstrates the advantages offered over alternative cluster-based and dimension-reduction methods. Two applications to Neisseria meningitidis illustrate the versatility and potential of our method, confirming previously-identified penicillin resistance loci and resulting in the identification of both well-characterised and novel drivers of invasive disease. Our method is implemented as an open-source R package called treeWAS which is freely available at . [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
42. What drives the perceptual change resulting from speech motor adaptation? Evaluation of hypotheses in a Bayesian modeling framework.
- Author
-
Patri, Jean-François, Perrier, Pascal, Schwartz, Jean-Luc, and Diard, Julien
- Subjects
MOTOR ability ,MOTOR ability testing ,SPEECH perception ,HEARING ,PERTURBATION theory - Abstract
Shifts in perceptual boundaries resulting from speech motor learning induced by perturbations of the auditory feedback were taken as evidence for the involvement of motor functions in auditory speech perception. Beyond this general statement, the precise mechanisms underlying this involvement are not yet fully understood. In this paper we propose a quantitative evaluation of some hypotheses concerning the motor and auditory updates that could result from motor learning, in the context of various assumptions about the roles of the auditory and somatosensory pathways in speech perception. This analysis was made possible thanks to the use of a Bayesian model that implements these hypotheses by expressing the relationships between speech production and speech perception in a joint probability distribution. The evaluation focuses on how the hypotheses can (1) predict the location of perceptual boundary shifts once the perturbation has been removed, (2) account for the magnitude of the compensation in presence of the perturbation, and (3) describe the correlation between these two behavioral characteristics. Experimental findings about changes in speech perception following adaptation to auditory feedback perturbations serve as reference. Simulations suggest that they are compatible with a framework in which motor adaptation updates both the auditory-motor internal model and the auditory characterization of the perturbed phoneme, and where perception involves both auditory and somatosensory pathways. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
43. A mechanistic spatio-temporal framework for modelling individual-to-individual transmission—With an application to the 2014-2015 West Africa Ebola outbreak.
- Author
-
Lau, Max S. Y., Gibson, Gavin J., Adrakey, Hola, McClelland, Amanda, Riley, Steven, Zelner, Jon, Streftaris, George, Funk, Sebastian, Metcalf, Jessica, Dalziel, Benjamin D., and Grenfell, Bryan T.
- Subjects
EBOLA virus ,SPATIO-temporal variation ,EBOLA viral disease transmission ,NEGATIVE-strand RNA viruses - Abstract
In recent years there has been growing availability of individual-level spatio-temporal disease data, particularly due to the use of modern communicating devices with GPS tracking functionality. These detailed data have been proven useful for inferring disease transmission to a more refined level than previously. However, there remains a lack of statistically sound frameworks to model the underlying transmission dynamic in a mechanistic manner. Such a development is particularly crucial for enabling a general epidemic predictive framework at the individual level. In this paper we propose a new statistical framework for mechanistically modelling individual-to-individual disease transmission in a landscape with heterogeneous population density. Our methodology is first tested using simulated datasets, validating our inferential machinery. The methodology is subsequently applied to data that describes a regional Ebola outbreak in Western Africa (2014-2015). Our results show that the methods are able to obtain estimates of key epidemiological parameters that are broadly consistent with the literature, while revealing a significantly shorter distance of transmission. More importantly, in contrast to existing approaches, we are able to perform a more general model prediction that takes into account the susceptible population. Finally, our results show that, given reasonable scenarios, the framework can be an effective surrogate for susceptible-explicit individual models which are often computationally challenging. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
44. Clusternomics: Integrative context-dependent clustering for heterogeneous datasets.
- Author
-
Gabasova, Evelina, Reid, John, and Wernisch, Lorenz
- Subjects
GENE expression ,DNA copy number variations ,MICRORNA ,DNA methylation ,PROTEOMICS - Abstract
Integrative clustering is used to identify groups of samples by jointly analysing multiple datasets describing the same set of biological samples, such as gene expression, copy number, methylation etc. Most existing algorithms for integrative clustering assume that there is a shared consistent set of clusters across all datasets, and most of the data samples follow this structure. However in practice, the structure across heterogeneous datasets can be more varied, with clusters being joined in some datasets and separated in others. In this paper, we present a probabilistic clustering method to identify groups across datasets that do not share the same cluster structure. The proposed algorithm, Clusternomics, identifies groups of samples that share their global behaviour across heterogeneous datasets. The algorithm models clusters on the level of individual datasets, while also extracting global structure that arises from the local cluster assignments. Clusters on both the local and the global level are modelled using a hierarchical Dirichlet mixture model to identify structure on both levels. We evaluated the model both on simulated and on real-world datasets. The simulated data exemplifies datasets with varying degrees of common structure. In such a setting Clusternomics outperforms existing algorithms for integrative and consensus clustering. In a real-world application, we used the algorithm for cancer subtyping, identifying subtypes of cancer from heterogeneous datasets. We applied the algorithm to TCGA breast cancer dataset, integrating gene expression, miRNA expression, DNA methylation and proteomics. The algorithm extracted clinically meaningful clusters with significantly different survival probabilities. We also evaluated the algorithm on lung and kidney cancer TCGA datasets with high dimensionality, again showing clinically significant results and scalability of the algorithm. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
45. The role of glutamate in neuronal ion homeostasis: A case study of spreading depolarization.
- Author
-
Hübel, Niklas, Hosseini-Zare, Mahshid S., Žiburkus, Jokūbas, and Ullah, Ghanim
- Subjects
GLUTAMIC acid ,HOMEOSTASIS ,NEUROCHEMISTRY ,NEUROTRANSMITTERS ,CYTOLOGY ,NEURONS ,THERAPEUTICS - Abstract
Simultaneous changes in ion concentrations, glutamate, and cell volume together with exchange of matter between cell network and vasculature are ubiquitous in numerous brain pathologies. A complete understanding of pathological conditions as well as normal brain function, therefore, hinges on elucidating the molecular and cellular pathways involved in these mostly interdependent variations. In this paper, we develop the first computational framework that combines the Hodgkin–Huxley type spiking dynamics, dynamic ion concentrations and glutamate homeostasis, neuronal and astroglial volume changes, and ion exchange with vasculature into a comprehensive model to elucidate the role of glutamate uptake in the dynamics of spreading depolarization (SD)—the electrophysiological event underlying numerous pathologies including migraine, ischemic stroke, aneurysmal subarachnoid hemorrhage, intracerebral hematoma, and trauma. We are particularly interested in investigating the role of glutamate in the duration and termination of SD caused by K+ perfusion and oxygen-glucose deprivation. Our results demonstrate that glutamate signaling plays a key role in the dynamics of SD, and that impaired glutamate uptake leads to recovery failure of neurons from SD. We confirm predictions from our model experimentally by showing that inhibiting astrocytic glutamate uptake using TFB-TBOA nearly quadruples the duration of SD in layers 2-3 of visual cortical slices from juvenile rats. The model equations are either derived purely from first physical principles of electroneutrality, osmosis, and conservation of particles or a combination of these principles and known physiological facts. Accordingly, we claim that our approach can be used as a future guide to investigate the role of glutamate, ion concentrations, and cell volume dynamics in other brain pathologies and normal brain function. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
46. PCSF: An R-package for network-based interpretation of high-throughput data.
- Author
-
Akhmedov, Murodzhon, Kedaigle, Amanda, Chong, Renan Escalante, Montemanni, Roberto, Bertoni, Francesco, Fraenkel, Ernest, and Kwee, Ivo
- Subjects
BIOINFORMATICS software ,DATA analysis software ,MATHEMATICAL optimization ,COMPUTATIONAL biology ,PROTEIN-protein interactions - Abstract
With the recent technological developments, a vast amount of high-throughput data has been profiled to understand the mechanism of complex diseases. The current bioinformatics challenge is to interpret the data and underlying biology, where efficient algorithms for analyzing heterogeneous high-throughput data using biological networks are becoming increasingly valuable. In this paper, we propose a software package based on the Prize-collecting Steiner Forest graph optimization approach. The PCSF package performs fast and user-friendly network analysis of high-throughput data by mapping the data onto biological networks such as protein-protein interaction, gene-gene interaction or any other correlation or coexpression based networks. Using the interaction networks as a template, it determines high-confidence subnetworks relevant to the data, which potentially leads to predictions of functional units. It also interactively visualizes the resulting subnetwork with functional enrichment analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
47. Fast and general tests of genetic interaction for genome-wide association studies.
- Author
-
Frånberg, Mattias, Strawbridge, Rona J., Hamsten, Anders, null, null, de Faire, Ulf, Lagergren, Jens, and Sennblad, Bengt
- Subjects
GENOMES ,DISEASES ,CORONARY disease ,PHYSICAL sciences ,CARDIOLOGY - Abstract
A complex disease has, by definition, multiple genetic causes. In theory, these causes could be identified individually, but their identification will likely benefit from informed use of anticipated interactions between causes. In addition, characterizing and understanding interactions must be considered key to revealing the etiology of any complex disease. Large-scale collaborative efforts are now paving the way for comprehensive studies of interaction. As a consequence, there is a need for methods with a computational efficiency sufficient for modern data sets as well as for improvements of statistical accuracy and power. Another issue is that, currently, the relation between different methods for interaction inference is in many cases not transparent, complicating the comparison and interpretation of results between different interaction studies. In this paper we present computationally efficient tests of interaction for the complete family of generalized linear models (GLMs). The tests can be applied for inference of single or multiple interaction parameters, but we show, by simulation, that jointly testing the full set of interaction parameters yields superior power and control of false positive rate. Based on these tests we also describe how to combine results from multiple independent studies of interaction in a meta-analysis. We investigate the impact of several assumptions commonly made when modeling interactions. We also show that, across the important class of models with a full set of interaction parameters, jointly testing the interaction parameters yields identical results. Further, we apply our method to genetic data for cardiovascular disease. This allowed us to identify a putative interaction involved in Lp(a) plasma levels between two ‘tag’ variants in the LPA locus (p = 2.42 ⋅ 10⁻⁹) as well as replicate the interaction (p = 6.97 ⋅ 10⁻⁷). Finally, our meta-analysis method is used in a small (N = 16,181) study of interactions in myocardial infarction. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
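For a continuous phenotype, the joint test of interaction parameters that entry 47 advocates reduces to a standard nested-model comparison. The sketch below uses a single interaction term in an ordinary linear model; the paper's method covers the full GLM family and multiple interaction parameters, which this toy does not:

```python
import numpy as np
from scipy import stats

def joint_interaction_test(y, g1, g2):
    """F-test of the interaction term in y ~ g1 + g2 + g1:g2,
    via residual sums of squares of nested least-squares fits."""
    n = len(y)
    X0 = np.column_stack([np.ones(n), g1, g2])   # null model: no interaction
    X1 = np.column_stack([X0, g1 * g2])          # full model: with interaction
    rss = lambda X: np.sum((y - X @ np.linalg.lstsq(X, y, rcond=None)[0]) ** 2)
    df1 = X1.shape[1] - X0.shape[1]
    df2 = n - X1.shape[1]
    f = ((rss(X0) - rss(X1)) / df1) / (rss(X1) / df2)
    return stats.f.sf(f, df1, df2)               # p-value

rng = np.random.default_rng(1)
g1 = rng.integers(0, 3, 500).astype(float)       # additive 0/1/2 genotype coding
g2 = rng.integers(0, 3, 500).astype(float)
y = g1 + g2 + 0.8 * g1 * g2 + rng.standard_normal(500)
print(joint_interaction_test(y, g1, g2))         # tiny p: interaction detected
```

With several interaction parameters, the same F-statistic tests them jointly by widening `X1`, which is the joint test the abstract reports as having superior power to parameter-by-parameter testing.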
48. Classification and adaptive behavior prediction of children with autism spectrum disorder based upon multivariate data analysis of markers of oxidative stress and DNA methylation.
- Author
-
Howsmon, Daniel P., Kruger, Uwe, Melnyk, Stepan, James, S. Jill, and Hahn, Juergen
- Subjects
AUTISTIC children ,ADAPTABILITY (Personality) ,MULTIVARIATE analysis ,OXIDATIVE stress ,DNA methylation - Abstract
The number of diagnosed cases of Autism Spectrum Disorders (ASD) has increased dramatically over the last four decades; however, there is still considerable debate regarding the underlying pathophysiology of ASD. This lack of biological knowledge means that diagnoses must be made based on behavioral observations and psychometric tools. However, physiological measurements should support these behavioral diagnoses in the future in order to enable earlier and more accurate diagnoses. Stepping towards this goal of incorporating biochemical data into ASD diagnosis, this paper analyzes measurements of metabolite concentrations of the folate-dependent one-carbon metabolism and transulfuration pathways taken from blood samples of 83 participants with ASD and 76 age-matched neurotypical peers. Fisher Discriminant Analysis enables multivariate classification of the participants as on the spectrum or neurotypical which results in 96.1% of all neurotypical participants being correctly identified as such while still correctly identifying 97.6% of the ASD cohort. Furthermore, kernel partial least squares is used to predict adaptive behavior, as measured by the Vineland Adaptive Behavior Composite score, where measurement of five metabolites of the pathways was sufficient to predict the Vineland score with an R² of 0.45 after cross-validation. This level of accuracy for classification as well as severity prediction far exceeds any other approach in this field and is a strong indicator that the metabolites under consideration are strongly correlated with an ASD diagnosis but also that the statistical analysis used here offers tremendous potential for extracting important information from complex biochemical data sets. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
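Two-class Fisher Discriminant Analysis, as used in entry 48, projects samples onto the direction w = Sw⁻¹(m1 - m0) and classifies by a threshold on the projection. The synthetic "metabolite profile" data below stand in for the study's measurements, which are of course not reproduced here:

```python
import numpy as np

def fisher_discriminant(X0, X1):
    """Two-class FDA: direction w = Sw^-1 (m1 - m0), with a midpoint
    threshold on the projected class means for classification."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = (np.cov(X0, rowvar=False) * (len(X0) - 1)
          + np.cov(X1, rowvar=False) * (len(X1) - 1))   # within-class scatter
    w = np.linalg.solve(Sw, m1 - m0)
    thresh = w @ (m0 + m1) / 2.0
    return w, thresh

rng = np.random.default_rng(2)
X0 = rng.standard_normal((80, 5))          # synthetic group 0 profiles
X1 = rng.standard_normal((80, 5)) + 1.0    # synthetic group 1 profiles, shifted
w, thresh = fisher_discriminant(X0, X1)
acc = ((X1 @ w > thresh).mean() + (X0 @ w <= thresh).mean()) / 2.0
print(acc)   # balanced training accuracy on the synthetic data
```

The kernel partial-least-squares regression the study uses for Vineland-score prediction is a separate, nonlinear step not sketched here.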
49. graph-GPA: A graphical model for prioritizing GWAS results and investigating pleiotropic architecture.
- Author
-
Chung, Dongjun, Kim, Hang J., and Zhao, Hongyu
- Subjects
GENOMES ,PHENOTYPES ,MEDICAL genetics ,GENETIC pleiotropy ,MARKOV random fields - Abstract
Genome-wide association studies (GWAS) have identified tens of thousands of genetic variants associated with hundreds of phenotypes and diseases, which have provided clinical and medical benefits to patients with novel biomarkers and therapeutic targets. However, identification of risk variants associated with complex diseases remains challenging as they are often affected by many genetic variants with small or moderate effects. There has been accumulating evidence suggesting that different complex traits share common risk basis, namely pleiotropy. Recently, several statistical methods have been developed to improve statistical power to identify risk variants for complex traits through a joint analysis of multiple GWAS datasets by leveraging pleiotropy. While these methods were shown to improve statistical power for association mapping compared to separate analyses, they are still limited in the number of phenotypes that can be integrated. In order to address this challenge, in this paper, we propose a novel statistical framework, graph-GPA, to integrate a large number of GWAS datasets for multiple phenotypes using a hidden Markov random field approach. Application of graph-GPA to a joint analysis of GWAS datasets for 12 phenotypes shows that graph-GPA improves statistical power to identify risk variants compared to statistical methods based on smaller number of GWAS datasets. In addition, graph-GPA also promotes better understanding of genetic mechanisms shared among phenotypes, which can potentially be useful for the development of improved diagnosis and therapeutics. The R implementation of graph-GPA is currently available at . [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
50. Bayesian phylogeography of influenza A/H3N2 for the 2014-15 season in the United States using three frameworks of ancestral state reconstruction.
- Author
-
Magee, Daniel, Suchard, Marc A., and Scotch, Matthew
- Subjects
INFLUENZA A virus, H3N2 subtype ,BAYESIAN analysis ,PHYLOGEOGRAPHY ,PANDEMICS - Abstract
Ancestral state reconstruction in Bayesian phylogeography of virus pandemics has been improved by utilizing a Bayesian stochastic search variable selection (BSSVS) framework. Recently, this framework has been extended to model the transition rate matrix between discrete states as a generalized linear model (GLM) of genetic, geographic, demographic, and environmental predictors of interest to the virus and incorporating BSSVS to estimate the posterior inclusion probabilities of each predictor. Although the latter appears to enhance the biological validity of ancestral state reconstruction, there has yet to be a comparison of phylogenies created by the two methods. In this paper, we compare these two methods, while also using a primitive method without BSSVS, and highlight the differences in phylogenies created by each. We test six coalescent priors and six random sequence samples of H3N2 influenza during the 2014–15 flu season in the U.S. We show that the GLMs yield significantly greater root state posterior probabilities than the two alternative methods under five of the six priors, and significantly greater Kullback-Leibler divergence values than the two alternative methods under all priors. Furthermore, the GLMs strongly implicate temperature and precipitation as driving forces of this flu season and nearly unanimously identified a single root state, which exhibits the most tropical climate during a typical flu season in the U.S. The GLM, however, appears to be highly susceptible to sampling bias compared with the other methods, which casts doubt on whether its reconstructions should be favored over those created by alternate methods. We report that a BSSVS approach with a Poisson prior demonstrates less bias toward sample size under certain conditions than the GLMs or primitive models, and believe that the connection between reconstruction method and sampling bias warrants further investigation. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF