Human Pose Estimation Based on Improved Hourglass Networks
- DOI: 10.2991/cnci-19.2019.82
- Keywords: Human pose estimation, deep learning, hourglass network, ResNeXt network, compressed network model.
In human pose estimation from monocular still images, hourglass-network methods have become mainstream because the model enlarges the receptive field, allowing it to capture more context-related information. However, these models focus only on improving accuracy and do not account for the resulting growth in model complexity. In this paper, the ResNeXt module is presented for the first time as the basic component of the hourglass network. The goal is to compress the hourglass network and remove redundant parameters, yielding a stacked hourglass network built from ResNeXt modules for human pose estimation that captures the multi-scale interdependence between body joints. The network regresses the human body joint points in the form of heat maps and learns the spatial constraints between the joints through implicit modeling. Finally, the model is evaluated on two benchmark datasets, MPII and LSP. The network designed in this paper is simple in structure, has a reduced parameter count, and achieves test performance comparable to existing state-of-the-art methods.
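The parameter savings claimed above come from the grouped convolutions at the heart of the ResNeXt module. A minimal sketch of the arithmetic (channel sizes and group count here are illustrative assumptions, not values taken from the paper):

```python
# Why grouped (ResNeXt-style) convolutions compress a network:
# a standard k x k convolution has C_in * C_out * k * k weights,
# while splitting it into g groups gives
#   (C_in/g) * (C_out/g) * k * k * g  =  C_in * C_out * k * k / g.

def conv_params(c_in, c_out, k, groups=1):
    """Weight count of a k x k convolution layer with the given group count."""
    assert c_in % groups == 0 and c_out % groups == 0
    return (c_in // groups) * (c_out // groups) * k * k * groups

# Hypothetical example: 256 -> 256 channels, 3x3 kernel.
standard = conv_params(256, 256, 3)             # 589,824 weights
grouped = conv_params(256, 256, 3, groups=32)   # 18,432 weights (32x fewer)
print(standard, grouped)
```

With 32 groups (the cardinality used in the original ResNeXt design) the same 3x3 layer needs 32 times fewer weights, which is the mechanism behind the compression the abstract describes.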
- © 2019, the Authors. Published by Atlantis Press.
- Open Access
- This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
Cite this article
TY  - CONF
AU  - Xiaojun Bi
AU  - Xuelian Zou
PY  - 2019/05
DA  - 2019/05
TI  - Human Pose Estimation Based on Improved Hourglass Networks
BT  - Proceedings of the 2019 International Conference on Computer, Network, Communication and Information Systems (CNCI 2019)
PB  - Atlantis Press
SP  - 593
EP  - 601
SN  - 2352-538X
UR  - https://doi.org/10.2991/cnci-19.2019.82
DO  - 10.2991/cnci-19.2019.82
ID  - Bi2019/05
ER  -