Human uncertainty in explicit user feedback and its impact on the comparative evaluations of accurate prediction and personalisation.
- Source: Behaviour & Information Technology, May 2020, Vol. 39, Issue 5, p. 544-577. 34 pages; 1 color photograph, 4 diagrams, 6 charts, 17 graphs.
- Publication Year: 2020
Abstract
- In this article, we report on the lack of reliability of explicit user feedback and its interpretation in the light of system evaluation. It is known that given feedback strongly depends on the situational context. But even when many contextual factors are held constant, user feedback still proves to be unreliable. This impacts the evaluation of predictive algorithms, since it is not clear whether a deviation between a user response and its corresponding prediction can be seen as a flaw of the system or just as usual 'human uncertainty'. As a result, the perspective on the evaluation of adaptive systems basically changes. The main goal of this article is to demonstrate that simply increasing the amount of explicit feedback is not the key to sustainable system design innovation, as long as that information is not appropriately evaluated. To this end, we exploit a novel probabilistic approach to processing user feedback and identify biasing effects on accuracy metrics, error probabilities for system rankings, as well as natural limitations of evaluation. Finally, we discuss possible solution strategies and give advice for handling explicit user feedback that is associated with uncertainty. [ABSTRACT FROM AUTHOR]
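The following is a minimal illustrative sketch, not the authors' method: it assumes explicit ratings are noisy draws around a latent user opinion and shows how that "human uncertainty" can blur an RMSE-based comparison between two hypothetical predictors. All variable names and noise levels (e.g. human_noise = 0.8) are assumptions chosen for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup: each rating event has a latent "true" opinion, but the explicit
# rating the user gives is a noisy draw around it (human uncertainty).
n_items = 2000
true_opinion = rng.uniform(1, 5, n_items)            # latent ground truth (assumed)
human_noise = 0.8                                     # assumed std. dev. of rating noise
observed_rating = true_opinion + rng.normal(0, human_noise, n_items)

# Two hypothetical predictors: B is genuinely closer to the latent opinion than A.
pred_a = true_opinion + rng.normal(0, 0.60, n_items)
pred_b = true_opinion + rng.normal(0, 0.50, n_items)

def rmse(pred, target):
    """Root mean squared error between predictions and a target signal."""
    return float(np.sqrt(np.mean((pred - target) ** 2)))

# Against the noisy observed ratings, the measured gap between A and B shrinks
# and can even flip from sample to sample; against the latent opinion it is clear.
print("RMSE vs. observed ratings:", rmse(pred_a, observed_rating), rmse(pred_b, observed_rating))
print("RMSE vs. latent opinion:  ", rmse(pred_a, true_opinion), rmse(pred_b, true_opinion))
```

Across repeated runs of such a simulation, the ranking of the two predictors measured against observed ratings is flipped far more often than the ranking measured against the latent opinion, which is the kind of ranking error probability the article examines.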
Details
- Language: English
- ISSN: 0144-929X
- Volume: 39
- Issue: 5
- Database: Academic Search Index
- Journal: Behaviour & Information Technology
- Publication Type: Academic Journal
- Accession Number: 142554548
- Full Text: https://doi.org/10.1080/0144929X.2019.1604804