Consistency Models for Scalable and Fast Simulation-Based Inference
- Source:
- Neural Information Processing Systems (NeurIPS 2024)
- Publication Year:
- 2023
Abstract
- Simulation-based inference (SBI) is constantly in search of more expressive and efficient algorithms to accurately infer the parameters of complex simulation models. In line with this goal, we present consistency models for posterior estimation (CMPE), a new conditional sampler for SBI that inherits the advantages of recent unconstrained architectures and overcomes their sampling inefficiency at inference time. CMPE essentially distills a continuous probability flow and enables rapid few-shot inference with an unconstrained architecture that can be flexibly tailored to the structure of the estimation problem. We provide hyperparameters and default architectures that support consistency training over a wide range of different dimensions, including low-dimensional ones which are important in SBI workflows but were previously difficult to tackle even with unconditional consistency models. Our empirical evaluation demonstrates that CMPE not only outperforms current state-of-the-art algorithms on hard low-dimensional benchmarks, but also achieves competitive performance with much faster sampling speed on two realistic estimation problems with high data and/or parameter dimensions.
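To make the few-shot sampling idea from the abstract concrete, the sketch below illustrates conditional multistep consistency sampling in PyTorch: a single network call maps pure noise directly to a posterior estimate for a given observation, and optional re-noise/denoise steps refine it. This is a minimal illustration under assumed choices, not the authors' CMPE implementation; the class and function names, the MLP architecture, and the noise schedule are all hypothetical.

```python
import torch
import torch.nn as nn


# Hypothetical consistency network: maps (noisy parameters theta_t, noise level t,
# observation x) to an estimate of a posterior draw theta ~ p(theta | x).
class ConsistencyNet(nn.Module):
    def __init__(self, theta_dim, x_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(theta_dim + x_dim + 1, hidden),
            nn.SiLU(),
            nn.Linear(hidden, hidden),
            nn.SiLU(),
            nn.Linear(hidden, theta_dim),
        )

    def forward(self, theta_t, t, x):
        # The noise level t enters as one extra scalar feature per sample.
        t_feat = torch.full((theta_t.shape[0], 1), float(t))
        return self.net(torch.cat([theta_t, t_feat, x], dim=-1))


@torch.no_grad()
def multistep_sample(model, x, theta_dim, noise_levels=(80.0, 2.0), sigma_min=0.002):
    """Few-step conditional sampling with a trained consistency network.

    One forward pass maps noise at the largest noise level to a posterior
    estimate; each additional noise level re-noises and refines the estimate.
    """
    n = x.shape[0]
    t_max = noise_levels[0]
    theta = model(torch.randn(n, theta_dim) * t_max, t_max, x)
    for t in noise_levels[1:]:
        # Re-noise the current estimate to level t, then denoise it again.
        noise_scale = max(t**2 - sigma_min**2, 0.0) ** 0.5
        theta_t = theta + noise_scale * torch.randn_like(theta)
        theta = model(theta_t, t, x)
    return theta
```

With a trained network, posterior draws for a batch of observations then cost only one or two forward passes, which is the source of the fast sampling the abstract contrasts with iterative diffusion-style samplers.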
Details
- Database:
- arXiv
- Journal:
- Neural Information Processing Systems (NeurIPS 2024)
- Publication Type:
- Report
- Accession Number:
- edsarx.2312.05440
- Document Type:
- Working Paper