Proceedings of the 2018 International Conference on Network, Communication, Computer Engineering (NCCE 2018)

Human Action Recognition Based on Deep Images and Dense Trajectories

Authors
Xiaopeng Cui, Binwen Fan, Jingyu Yi
Corresponding Author
Xiaopeng Cui
Available Online May 2018.
DOI
10.2991/ncce-18.2018.169
Keywords
Human Action Recognition; Depth Map; Dense Trajectories; SVM.
Abstract

This paper implements a human action recognition method based on depth images and dense trajectories. First, a binocular RGB camera is used to capture image pairs, and depth images are obtained with a stereo matching algorithm. The depth images are used to extract human action sequences, and a dense optical flow field is then used to compute the trajectories of these sequences. We compare the HOG, HOF, and MBH descriptors, and finally use an SVM to recognize the human actions.
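As a rough illustration of this pipeline, the sketch below estimates a disparity/depth map from a rectified binocular pair with OpenCV's semi-global block matching, tracks grid-sampled points through Farneback dense optical flow to form short trajectories, and feeds a simple normalized-displacement descriptor (a stand-in for the HOG/HOF/MBH descriptors compared in the paper) to an SVM classifier. The use of OpenCV and scikit-learn, all parameter values, and helper names such as read_clip are illustrative assumptions, not the authors' implementation.

# Minimal sketch of the pipeline in the abstract, assuming OpenCV and scikit-learn.
# All file names, parameters, and the 15-frame trajectory length are illustrative.
import cv2
import numpy as np
from sklearn.svm import SVC  # used in the example usage at the bottom

def depth_from_stereo(left_bgr, right_bgr):
    """Approximate a depth map from a rectified binocular pair via semi-global matching."""
    left = cv2.cvtColor(left_bgr, cv2.COLOR_BGR2GRAY)
    right = cv2.cvtColor(right_bgr, cv2.COLOR_BGR2GRAY)
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=5)
    disparity = matcher.compute(left, right).astype(np.float32) / 16.0
    return disparity  # larger disparity = closer to the camera

def dense_trajectories(gray_frames, step=5, length=15):
    """Track points sampled on a regular grid through dense optical flow."""
    h, w = gray_frames[0].shape
    ys, xs = np.mgrid[step // 2:h:step, step // 2:w:step]
    tracks = np.stack([xs, ys], axis=-1).reshape(-1, 2).astype(np.float32)
    trajectories = [tracks.copy()]
    for prev, curr in zip(gray_frames[:length], gray_frames[1:length + 1]):
        flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        rows = np.clip(tracks[:, 1].astype(int), 0, h - 1)
        cols = np.clip(tracks[:, 0].astype(int), 0, w - 1)
        tracks = tracks + flow[rows, cols]          # advect each point by its flow vector
        trajectories.append(tracks.copy())
    return np.stack(trajectories, axis=1)           # shape: (num_points, length + 1, 2)

def trajectory_features(trajectories):
    """Normalized displacement descriptor per trajectory (stand-in for HOG/HOF/MBH)."""
    disp = np.diff(trajectories, axis=1)             # frame-to-frame displacements
    norm = np.linalg.norm(disp, axis=-1, keepdims=True) + 1e-6
    return (disp / norm).reshape(len(trajectories), -1)

# Hypothetical usage: read_clip, paths, and labels y are placeholders supplied elsewhere.
# clips = [[cv2.cvtColor(f, cv2.COLOR_BGR2GRAY) for f in read_clip(p)] for p in paths]
# X = np.array([trajectory_features(dense_trajectories(c)).mean(axis=0) for c in clips])
# clf = SVC(kernel="rbf").fit(X, y)                  # y: integer action labels

In line with the abstract's use of depth images to extract the human action sequence, the disparity map from depth_from_stereo could be thresholded to mask the person region before sampling trajectory points, so that only foreground motion contributes to the descriptors.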

Copyright
© 2018, the Authors. Published by Atlantis Press.
Open Access
This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).


Volume Title
Proceedings of the 2018 International Conference on Network, Communication, Computer Engineering (NCCE 2018)
Series
Advances in Intelligent Systems Research
Publication Date
May 2018
ISSN
1951-6851

Cite this article

TY  - CONF
AU  - Xiaopeng Cui
AU  - Binwen Fan
AU  - Jingyu Yi
PY  - 2018/05
DA  - 2018/05
TI  - Human Action Recognition Based on Deep Images and Dense Trajectories
BT  - Proceedings of the 2018 International Conference on Network, Communication, Computer Engineering (NCCE 2018)
PB  - Atlantis Press
SP  - 1012
EP  - 1018
SN  - 1951-6851
UR  - https://doi.org/10.2991/ncce-18.2018.169
DO  - 10.2991/ncce-18.2018.169
ID  - Cui2018/05
ER  -