
Learn to Predict Sets Using Feed-Forward Neural Networks.

Authors :
Rezatofighi, Hamid
Zhu, Tianyu
Kaskman, Roman
Motlagh, Farbod T.
Shi, Javen Qinfeng
Milan, Anton
Cremers, Daniel
Leal-Taixe, Laura
Reid, Ian
Source :
IEEE Transactions on Pattern Analysis & Machine Intelligence. Dec 2022, Vol. 44, Issue 12, p9011-9025. 15p.
Publication Year :
2022

Abstract

This paper addresses the task of set prediction using deep feed-forward neural networks. A set is a collection of elements that is invariant under permutation, and the size of a set is not fixed in advance. Many real-world problems, such as image tagging and object detection, have outputs that are naturally expressed as sets of entities. This creates a challenge for traditional deep neural networks, which naturally deal with structured outputs such as vectors, matrices or tensors. We present a novel approach for learning to predict sets with unknown permutation and cardinality using deep neural networks. In our formulation we define a likelihood for a set distribution represented by a) two discrete distributions defining the set cardinality and permutation variables, and b) a joint distribution over set elements with a fixed cardinality. Depending on the problem under consideration, we define different training models for set prediction using deep neural networks. We demonstrate the validity of our set formulations on relevant vision problems such as: 1) multi-label image classification, where we outperform the other competing methods on the PASCAL VOC and MS COCO datasets; 2) object detection, for which our formulation outperforms popular state-of-the-art detectors; and 3) a complex CAPTCHA test, where we observe that, surprisingly, our set-based network acquired the ability to mimic arithmetic without any rules being explicitly coded. [ABSTRACT FROM AUTHOR]
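The abstract only names the components of the set likelihood; a minimal sketch of one such factorization, written as an assumption based on the description above rather than the paper's exact equation, could read:

p(\mathcal{Y} \mid \mathbf{x}) \;=\; p(m \mid \mathbf{x}) \; p(\boldsymbol{\pi} \mid \mathbf{x}) \; p\!\left(y_{\pi_1}, \dots, y_{\pi_m} \mid m, \boldsymbol{\pi}, \mathbf{x}\right)

Here m denotes the (variable) set cardinality, \boldsymbol{\pi} a permutation over the m elements, and the final factor the joint distribution over the ordered element states; under this reading, training would amount to maximizing this likelihood over the outputs of a feed-forward network.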

Details

Language :
English
ISSN :
0162-8828
Volume :
44
Issue :
12
Database :
Academic Search Index
Journal :
IEEE Transactions on Pattern Analysis & Machine Intelligence
Publication Type :
Academic Journal
Accession number :
160650735
Full Text :
https://doi.org/10.1109/TPAMI.2021.3122970