5 results for "Calderhead, Ben"
Search Results
2. A general construction for parallelizing Metropolis-Hastings algorithms.
- Author
- Calderhead, Ben
- Subjects
- ALGORITHMS; MARKOV chain Monte Carlo; COMPUTER simulation; STATISTICS; NUMERICAL analysis
- Abstract
Markov chain Monte Carlo methods (MCMC) are essential tools for solving many modern-day statistical and computational problems; however, a major limitation is the inherently sequential nature of these algorithms. In this paper, we propose a natural generalization of the Metropolis-Hastings algorithm that allows for parallelizing a single chain using existing MCMC methods. We do so by proposing multiple points in parallel, then constructing and sampling from a finite-state Markov chain on the proposed points such that the overall procedure has the correct target density as its stationary distribution. Our approach is generally applicable and straightforward to implement. We demonstrate how this construction may be used to greatly increase the computational speed and statistical efficiency of a variety of existing MCMC methods, including Metropolis-Adjusted Langevin Algorithms and Adaptive MCMC. Furthermore, we show how it allows for a principled way of using every integration step within Hamiltonian Monte Carlo methods; our approach increases robustness to the choice of algorithmic parameters and results in increased accuracy of Monte Carlo estimates with little extra computational cost. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
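The construction described in the abstract — propose several points in parallel, then draw the next state from a finite-state chain on the proposal set whose stationary distribution matches the target — can be sketched roughly as follows. This is an illustrative simplification, not the paper's code: the function name `parallel_mh`, the symmetric Gaussian proposal kernel, and the shortcut of sampling directly from the stationary weights (rather than running the inner finite-state chain) are all assumptions of the sketch.

```python
import numpy as np

def parallel_mh(log_target, x0, n_props=4, sigma=1.0, n_iter=4000, rng=None):
    """Simplified multiple-proposal Metropolis-Hastings sketch.

    Each iteration draws n_props proposals around the current point, then
    samples the next state from the finite set {current, proposals} with
    weights w_i proportional to pi(x_i) * prod_{j != i} k(x_j | x_i),
    where k is the (symmetric Gaussian) proposal density.  The target
    evaluations over the proposal set are the step that can be farmed
    out in parallel; here they are simply looped.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    d = x.size
    samples = np.empty((n_iter, d))
    for t in range(n_iter):
        # Current point plus n_props proposals drawn from k(. | x).
        pts = np.vstack([x, x + sigma * rng.standard_normal((n_props, d))])
        # log pi(x_i) for every point in the set (parallelisable step).
        log_pi = np.array([log_target(p) for p in pts])
        # log prod_{j != i} k(x_j | x_i); normalising constants cancel.
        sq = np.sum((pts[:, None, :] - pts[None, :, :]) ** 2, axis=-1)
        log_w = log_pi - sq.sum(axis=1) / (2.0 * sigma ** 2)
        w = np.exp(log_w - log_w.max())
        # Draw the next state from the stationary weights of the
        # finite-state chain on the proposal set.
        x = pts[rng.choice(n_props + 1, p=w / w.sum())]
        samples[t] = x
    return samples

# Usage: sample a 1-D standard normal target.
samples = parallel_mh(lambda x: -0.5 * float(x @ x), x0=[0.0],
                      rng=np.random.default_rng(0))
print(samples.mean(), samples.std())
```

With a symmetric kernel the diagonal of `sq` is zero, so the current point is included in the weight computation on the same footing as the proposals, which is what lets every proposed point contribute rather than being discarded on rejection.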
3. Hamiltonian Monte Carlo methods for efficient parameter estimation in steady state dynamical systems.
- Author
- Kramer, Andrei, Calderhead, Ben, and Radde, Nicole
- Abstract
Background: Parameter estimation for differential equation models of intracellular processes is a highly relevant but challenging task. The available experimental data do not usually contain enough information to identify all parameters uniquely, resulting in ill-posed estimation problems with often highly correlated parameters. Sampling-based Bayesian statistical approaches are appropriate for tackling this problem. The samples are typically generated via Markov chain Monte Carlo, however such methods are computationally expensive and their convergence may be slow, especially if there are strong correlations between parameters. Monte Carlo methods based on Euclidean or Riemannian Hamiltonian dynamics have been shown to outperform other samplers by making proposal moves that take the local sensitivities of the system’s states into account and accepting these moves with high probability. However, the high computational cost involved with calculating the Hamiltonian trajectories prevents their widespread use for all but the smallest differential equation models. The further development of efficient sampling algorithms is therefore an important step towards improving the statistical analysis of predictive models of intracellular processes. Results: We show how state-of-the-art Hamiltonian Monte Carlo methods may be significantly improved for steady state dynamical models. We present a novel approach for efficiently calculating the required geometric quantities by tracking steady states across the Hamiltonian trajectories using a Newton-Raphson method and employing local sensitivity information. Using our approach, we compare both Euclidean and Riemannian versions of Hamiltonian Monte Carlo on three models for intracellular processes with real data and demonstrate at least an order of magnitude improvement in the effective sampling speed.
We further demonstrate the wider applicability of our approach to other gradient-based MCMC methods, such as those based on Langevin diffusions. Conclusion: Our approach is strictly beneficial in all test cases. The Matlab sources implementing our MCMC methodology are available from https://github.com/a-kramer/ode_rmhmc. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
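The key efficiency idea in the abstract — the steady state changes only slightly per leapfrog step along a Hamiltonian trajectory, so the previous solution is an excellent warm start for a Newton-Raphson solve — can be illustrated with a minimal sketch. The function `newton_steady_state`, its signature, and the toy model are assumptions for illustration; the paper's actual Matlab implementation (which also propagates local sensitivity information for the geometric quantities) is at the linked GitHub repository.

```python
import numpy as np

def newton_steady_state(f, jac_x, x0, theta, tol=1e-10, max_iter=50):
    """Warm-started Newton-Raphson solve of f(x, theta) = 0.

    f       : right-hand side of the ODE dx/dt = f(x, theta)
    jac_x   : Jacobian of f with respect to the state x
    x0      : initial guess, e.g. the steady state from the previous
              leapfrog step, so only a few iterations are needed
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(max_iter):
        fx = f(x, theta)
        if np.linalg.norm(fx) < tol:
            break
        # Newton update: solve J * dx = f, then step x -> x - dx.
        x -= np.linalg.solve(jac_x(x, theta), fx)
    return x

# Toy model: dx/dt = theta[0] - theta[1] * x, whose steady state is
# x* = theta[0] / theta[1].
f = lambda x, th: th[0] - th[1] * x
jac = lambda x, th: np.array([[-th[1]]])
x_star = newton_steady_state(f, jac, x0=[1.0], theta=[2.0, 4.0])
print(x_star)  # ≈ [0.5]
```

Because the parameters move continuously along the trajectory, this root-finding replaces repeated numerical integration of the ODE to equilibrium, which is where the reported order-of-magnitude speedup plausibly comes from.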
4. Ten Simple Rules for a Successful Cross-Disciplinary Collaboration.
- Author
- Knapp, Bernhard, Bardenet, Rémi, Bernabeu, Miguel O., Bordas, Rafel, Bruna, Maria, Calderhead, Ben, Cooper, Jonathan, Fletcher, Alexander G., Groen, Derek, Kuijper, Bram, Lewis, Joanna, McInerny, Greg, Minssen, Timo, Osborne, James, Paulitschke, Verena, Pitt-Francis, Joe, Todoric, Jelena, Yates, Christian A., Gavaghan, David, and Deane, Charlotte M.
- Subjects
- EXPERIMENTAL biology; INFORMATION technology research; INTERDISCIPLINARY research; LIFE sciences; PEER review of research grant proposals
- Abstract
The article presents ten simple rules for a successful cross-disciplinary collaboration. Topics discussed include research in experimental biology, need of multidisciplinary collaborations, need to recognize the publication culture in the life sciences and in experimental biology, and publication of information technology research in peer-reviewed conference proceedings instead of journals.
- Published
- 2015
- Full Text
- View/download PDF
5. Changing How Earth System Modeling is Done to Provide More Useful Information for Decision Making, Science, and Society.
- Author
- Smith, Matthew J., Palmer, Paul I., Purves, Drew W., Vanderwel, Mark C., Lyutsarev, Vassily, Calderhead, Ben, Joppa, Lucas N., Bishop, Christopher M., and Emmott, Stephen
- Subjects
- EARTH (Planet); REALISM; EFFECT of human beings on weather; DECISION making; SCIENCE
- Abstract
New details about natural and anthropogenic processes are continually added to models of the Earth system, anticipating that the increased realism will increase the accuracy of their predictions. However, perspectives differ about whether this approach will improve the value of the information the models provide to decision makers, scientists, and societies. The present bias toward increasing realism leads to a range of updated projections, but at the expense of uncertainty quantification and model tractability. This bias makes it difficult to quantify the uncertainty associated with the projections from any one model or with the distribution of projections from different models. This in turn limits the utility of climate model outputs for deriving useful information, such as for designing effective climate change mitigation and adaptation strategies or for identifying and prioritizing sources of uncertainty for reduction. Here we argue that a new approach to model development is needed, focused on the delivery of information to support specific policy decisions or science questions. The central tenet of this approach is the assessment and justification of the overall balance of model detail that reflects the question posed, current knowledge, available data, and sources of uncertainty. This differs from contemporary practices by explicitly seeking to quantify both the benefits and costs of details at a systemic level, taking into account the precision and accuracy with which predictions are made when compared to existing empirical evidence. We specify changes to contemporary model development practices that would help in achieving this goal. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF