Performance assessment of a system for reasoning under uncertainty
- Author
- Marion Byrne, Branko Ristic, and Christopher Gilliam
- Subjects
- Ground truth, Computer science, Networking & telecommunications, Machine learning, Imprecise probability, Measure (mathematics), Hardware and Architecture, Signal Processing, Artificial intelligence, Software, Randomness, Information Systems, Possibility theory
- Abstract
Since the early development of machines for reasoning and decision making in higher-level information fusion, there has been a need for systematic and reliable evaluation of their performance. Performance evaluation is essential for comparing and assessing alternative solutions to real-world problems. In this paper we focus on one aspect of performance assessment for reasoning under uncertainty: the accuracy of the resulting belief (prediction or estimate). We propose an assessment framework based on the assumption that the uncertainty in the system under investigation is due only to stochastic variability (randomness), which is partially known. In this context we formulate a distance measure between the “ground truth” and the output of an automated reasoning system expressed in one of the non-additive uncertainty formalisms (such as imprecise probability theory, belief function theory, or possibility theory). The proposed assessment framework is demonstrated with a simple numerical example.
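The abstract does not specify the paper's actual distance measure, so the following is only a minimal illustrative sketch, assuming belief function (Dempster-Shafer) theory as the non-additive formalism: the system's output is taken to be a mass function over subsets of a finite frame, which is compared against a known ground-truth probability distribution by applying the pignistic transform and then taking the total variation distance. The frame, mass values, and helper names below are hypothetical, not from the paper.

```python
def pignistic(mass, frame):
    """Pignistic transform: spread each focal element's mass
    uniformly over its singletons (Smets' betting probability)."""
    betp = {x: 0.0 for x in frame}
    for focal, m in mass.items():
        for x in focal:
            betp[x] += m / len(focal)
    return betp

def total_variation(p, q):
    """Total variation distance between two probability
    distributions defined over the same finite frame."""
    return 0.5 * sum(abs(p[x] - q[x]) for x in p)

# Frame of discernment and the (partially known) ground-truth distribution.
frame = ("a", "b", "c")
ground_truth = {"a": 0.6, "b": 0.3, "c": 0.1}

# Hypothetical system output: a mass function over subsets of the frame.
# Mass on the non-singleton set expresses partial ignorance.
system_mass = {
    ("a",): 0.4,
    ("b",): 0.3,
    ("a", "b", "c"): 0.3,
}

betp = pignistic(system_mass, frame)
print("pignistic:", betp)                           # {'a': 0.5, 'b': 0.4, 'c': 0.1}
print("distance:", total_variation(ground_truth, betp))  # 0.1
```

Mass assigned to non-singleton sets is what makes the output non-additive; a purely probabilistic system would place all mass on singletons, and the distance would then reduce to an ordinary comparison of two distributions.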
- Published
- 2021