
IRIS: Interpretable Rubric-Informed Segmentation for Action Quality Assessment

Authors:
Matsuyama, Hitoshi
Kawaguchi, Nobuo
Lim, Brian Y.
Publication Year:
2023

Abstract

AI-driven Action Quality Assessment (AQA) of sports videos can mimic Olympic judges to help score performances as a second opinion or for training. However, these AI methods are uninterpretable and do not justify their scores, which is important for algorithmic accountability. Indeed, to account for their decisions, sports judges do not score subjectively; they apply a consistent set of criteria (a rubric) to the multiple actions in each performance sequence. We therefore propose IRIS, which performs Interpretable Rubric-Informed Segmentation on action sequences for AQA. We investigated IRIS for scoring videos of figure skating performances. IRIS predicts (1) action segments, (2) technical element score differences of each segment relative to base scores, (3) multiple program component scores, and (4) the summed final score. In a modeling study, we found that IRIS performs better than non-interpretable, state-of-the-art models. In a formative user study, practicing figure skaters agreed with the rubric-informed explanations, found them useful, and trusted AI judgments more. This work highlights the importance of using judgment rubrics to account for AI decisions.

Comment: 28th International Conference on Intelligent User Interfaces (IUI 2023)
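The abstract's decomposition of the final score into per-segment technical element score differences relative to base scores, program component scores, and their sum can be illustrated with a minimal sketch. This is not the paper's implementation; the `Element` class, field names, and example values below are hypothetical, chosen only to mirror how figure-skating scoring combines rubric base values, per-element adjustments, and component scores.

```python
from dataclasses import dataclass


@dataclass
class Element:
    name: str          # e.g. "3Lz" (triple Lutz); identifier is illustrative
    base_value: float  # rubric-defined base score for the element
    delta: float       # predicted score difference relative to the base value


def final_score(elements: list[Element], component_scores: list[float]) -> float:
    """Sum per-element (base + delta) technical scores plus program component scores."""
    technical = sum(e.base_value + e.delta for e in elements)
    components = sum(component_scores)
    return technical + components


# Hypothetical example values, not taken from the paper
elements = [Element("3Lz", 5.9, 0.8), Element("StSq3", 3.3, 0.5)]
print(final_score(elements, component_scores=[7.25, 7.50, 7.00]))
```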

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2303.09097
Document Type:
Working Paper