A Dynamic Frame Selection Framework for Fast Video Recognition.
- Source :
- IEEE Transactions on Pattern Analysis & Machine Intelligence; Apr 2022, Vol. 44 Issue 4, p1699-1711, 13p
- Publication Year :
- 2022
Abstract
- We introduce AdaFrame, a conditional computation framework that adaptively selects relevant frames on a per-input basis for fast video recognition. AdaFrame, which contains a Long Short-Term Memory network augmented with a global memory that provides context information, operates as an agent that interacts with video sequences, searching over time for which frames to use. Trained with policy search methods, AdaFrame at each time step computes a prediction, decides where to observe next, and estimates a utility, i.e., the expected future reward, of viewing more frames. By exploring predicted utilities at test time, AdaFrame achieves adaptive lookahead inference that minimizes overall computational cost without degrading accuracy. We conduct extensive experiments on two large-scale video benchmarks, FCVID and ActivityNet. With a vanilla ResNet-101 model, AdaFrame achieves performance similar to using all frames while requiring, on average, only 8.21 and 8.65 frames on FCVID and ActivityNet, respectively. We also demonstrate that AdaFrame is compatible with modern 2D and 3D networks for video recognition. Furthermore, we show, among other things, that learned frame usage reflects the difficulty of making predictions, both at the instance level within the same class and at the class level across different categories. [ABSTRACT FROM AUTHOR]
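- The abstract describes a recurrent agent that, at each step, emits a classification prediction, a location for the next frame to observe, and a scalar utility estimate used for early stopping. The following is a minimal, hypothetical PyTorch sketch of that per-step loop; the module names, dimensions, stopping criterion, and the omission of the global memory component are assumptions for illustration, not the authors' released implementation.

```python
# Hypothetical sketch of an AdaFrame-style per-step loop (not the authors' code).
import torch
import torch.nn as nn


class FrameSelectionAgent(nn.Module):
    def __init__(self, feat_dim=2048, hidden_dim=1024, num_classes=200):
        super().__init__()
        self.rnn = nn.LSTMCell(feat_dim, hidden_dim)           # recurrent core
        self.classifier = nn.Linear(hidden_dim, num_classes)   # per-step class prediction
        self.locator = nn.Linear(hidden_dim, 1)                # where to observe next
        self.utility = nn.Linear(hidden_dim, 1)                # expected reward of viewing more frames

    def step(self, frame_feat, state):
        h, c = self.rnn(frame_feat, state)
        logits = self.classifier(h)                            # prediction at this step
        next_pos = torch.sigmoid(self.locator(h))              # next frame position, normalized to [0, 1]
        value = self.utility(h)                                # utility estimate used for early stopping
        return logits, next_pos, value, (h, c)


def adaptive_inference(agent, frame_feats, max_steps=10, utility_threshold=0.0):
    """Run the agent for at most `max_steps` frames, stopping early when the
    estimated utility of viewing more frames falls below a threshold
    (an assumed stopping rule for illustration)."""
    num_frames = frame_feats.size(0)
    state = None
    pos = torch.tensor(0.5)                                    # start near the middle of the video
    logits = None
    for _ in range(max_steps):
        idx = int(pos.item() * (num_frames - 1))
        logits, pos, value, state = agent.step(frame_feats[idx].unsqueeze(0), state)
        if value.item() < utility_threshold:                   # little expected gain from more frames
            break
    return logits
```

In this sketch, `frame_feats` is assumed to be a tensor of precomputed per-frame features (e.g., from a 2D CNN backbone); training the location and utility heads with policy search, as the abstract describes, is not shown.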
- Subjects :
- REWARD (Psychology)
- FRAMES (Social sciences)
- DECISION making
- REINFORCEMENT learning
Details
- Language :
- English
- ISSN :
- 0162-8828
- Volume :
- 44
- Issue :
- 4
- Database :
- Complementary Index
- Journal :
- IEEE Transactions on Pattern Analysis & Machine Intelligence
- Publication Type :
- Academic Journal
- Accession number :
- 155735831
- Full Text :
- https://doi.org/10.1109/TPAMI.2020.3029425