
Adaptive data augmentation network for human pose estimation.

Authors :
Wang, Dong
Xie, Wenjun
Cai, Youcheng
Liu, Xiaoping
Source :
Digital Signal Processing. Sep 2022, Vol. 129.
Publication Year :
2022

Abstract

With the rapid development of convolutional neural networks (CNNs), the performance of human pose estimation has improved significantly. However, state-of-the-art methods still face specific challenges, such as occluded keypoints and nearby persons. Unlike the classical strategy of improving the network itself, our approach generates adaptive training data to obtain more accurate keypoints through data augmentation. In this paper, we propose the Adaptive Data Augmentation Network (ADA-Net), which produces more adaptive training data by adding occlusion and interleaving information to the original image. First, we introduce the Active Transmission Network (ATNet), which actively learns a transformation matrix from the original image and uses that matrix to synthesize new images for training. Second, we adopt an adversarial training strategy combined with ATNet, which allows us to capture more challenging cases. Extensive experiments show that our approach achieves comparable or even better results than most state-of-the-art methods. In particular, our ADA-Net outperforms the High-Resolution Network (HRNet) by 1.1 and 0.7 points on the COCO test-dev set and the MPII validation set, respectively.

• The diversity of training samples enhances the performance of human pose estimation.
• The number of samples pasted onto the original image affects network performance.
• Pasting a complete person is beneficial for enriching the original image information.
• The generative adversarial network brings the synthetic image closer to real challenging cases.

[ABSTRACT FROM AUTHOR]
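The abstract describes ATNet as learning a transformation matrix from an image and using it to synthesize new training samples. As a rough illustration of that core idea (not the paper's actual implementation), the sketch below applies a 2x3 affine matrix to an image via inverse mapping; in ADA-Net such a matrix would be predicted by a network and trained adversarially against the pose estimator, whereas here it is supplied by hand. All names and the nearest-neighbour warp are illustrative assumptions.

```python
import numpy as np

def affine_augment(image: np.ndarray, matrix: np.ndarray) -> np.ndarray:
    """Warp a grayscale image (H, W) with a 2x3 affine matrix.

    Hypothetical sketch: each output pixel is filled by inverse-mapping
    its coordinates through the matrix and sampling the source image
    with nearest-neighbour rounding. Out-of-bounds samples stay zero.
    """
    h, w = image.shape
    out = np.zeros_like(image)
    # Promote the 2x3 matrix to 3x3 so it can be inverted.
    full = np.vstack([matrix, [0.0, 0.0, 1.0]])
    inv = np.linalg.inv(full)
    ys, xs = np.mgrid[0:h, 0:w]
    coords = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    src = inv @ coords
    sx = np.rint(src[0]).astype(int)
    sy = np.rint(src[1]).astype(int)
    valid = (sx >= 0) & (sx < w) & (sy >= 0) & (sy < h)
    out[ys.ravel()[valid], xs.ravel()[valid]] = image[sy[valid], sx[valid]]
    return out

# An identity matrix leaves the image unchanged; a learned ATNet-style
# matrix would instead produce occluded/overlapping variants that the
# pose estimator finds hard, per the adversarial training strategy.
img = np.arange(16, dtype=float).reshape(4, 4)
identity = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
shifted = affine_augment(img, np.array([[1.0, 0.0, 1.0], [0.0, 1.0, 0.0]]))
```

A translation matrix with offset `[1, 0]`, for example, shifts the image one pixel to the right, filling the vacated column with zeros, which is how pasted or shifted content can be composited into the original frame.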

Details

Language :
English
ISSN :
1051-2004
Volume :
129
Database :
Academic Search Index
Journal :
Digital Signal Processing
Publication Type :
Periodical
Accession number :
158817713
Full Text :
https://doi.org/10.1016/j.dsp.2022.103681