
Action anticipation for collaborative environments: The impact of contextual information and uncertainty-based prediction.

Authors :
Canuto, Clebeson
Moreno, Plinio
Samatelo, Jorge
Vassallo, Raquel
Santos-Victor, José
Source :
Neurocomputing. Jul 2021, Vol. 444, p301-318. 18p.
Publication Year :
2021

Abstract

To interact with humans in collaborative environments, machines need to predict (i.e., anticipate) future events and execute actions in a timely manner. However, observing human limb movements alone may not be sufficient to anticipate actions unambiguously. In this work, we consider two additional sources of contextual information over time, gaze and object information, alongside movement, and study how these contextual cues improve action anticipation performance. We address action anticipation as a classification task, where the model takes the available information as input and predicts the most likely action. We propose to use the uncertainty about each prediction as an online decision-making criterion for action anticipation. Uncertainty is modeled as a stochastic process applied to a time-based neural network architecture, which improves on the conventional class-likelihood (i.e., deterministic) criterion. The main contributions of this paper are fourfold: (i) we propose a novel and effective decision-making criterion that can anticipate actions even in situations of high ambiguity; (ii) we propose a deep architecture that outperforms previous results on the action anticipation task using the Acticipate collaborative dataset; (iii) we show that contextual information is important to disambiguate the interpretation of similar actions; and (iv) we provide a formal description of three existing performance metrics that can be easily used to evaluate action anticipation models. Our results on the Acticipate dataset show the importance of contextual information and the uncertainty criterion for action anticipation. We achieve an average accuracy of 98.75% in the anticipation task using, on average, only 25% of the observations.
Moreover, considering that a good anticipation model should also perform well in the action recognition task, we achieve an average accuracy of 100% in action recognition on the Acticipate dataset when the entire observation set is used. [ABSTRACT FROM AUTHOR]
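The uncertainty-based decision criterion described in the abstract can be illustrated with a minimal sketch; this is not the authors' implementation, and the function names, the use of predictive entropy over stochastic samples (e.g., Monte Carlo dropout passes), and the threshold value are illustrative assumptions:

```python
import numpy as np

def predictive_entropy(samples):
    """Entropy (in nats) of the mean over stochastic forward passes.

    `samples` is an (n_passes, n_classes) array of softmax outputs;
    returns the entropy of the averaged distribution and the average.
    """
    mean = np.asarray(samples).mean(axis=0)
    return -np.sum(mean * np.log(mean + 1e-12)), mean

def anticipate(prob_stream, entropy_threshold=0.3):
    """Online anticipation: commit to an action as soon as uncertainty
    (predictive entropy) falls below a threshold.

    `prob_stream` is a sequence over time; each element holds several
    stochastic class-probability samples for that timestep. Returns
    (decision_timestep, predicted_class_index).
    """
    for t, samples in enumerate(prob_stream):
        entropy, mean = predictive_entropy(samples)
        if entropy < entropy_threshold:
            return t, int(np.argmax(mean))  # confident: decide early
    # never confident enough: fall back to the final-frame prediction
    return len(prob_stream) - 1, int(np.argmax(mean))
```

For example, a stream whose first timestep is ambiguous (entropy near ln 2 for two classes) but whose second timestep is consistently confident triggers a decision at the second timestep, after observing only part of the sequence.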

Details

Language :
English
ISSN :
0925-2312
Volume :
444
Database :
Academic Search Index
Journal :
Neurocomputing
Publication Type :
Academic Journal
Accession number :
150171601
Full Text :
https://doi.org/10.1016/j.neucom.2020.07.135