Proceedings of the 3rd International Conference on Multimedia Technology (ICMT-13)

Manifold Learning Method for Large Scale Dataset Based on Gradient Descent

Authors
Wang Yunhe, Gao Yuan, Xu Chao
Corresponding Author
Wang Yunhe
Available Online November 2013.
DOI
https://doi.org/10.2991/icmt-13.2013.145
Keywords
manifold learning; LLE; gradient descent; time complexity.
Abstract
Dimension reduction has been a research hotspot in recent years, especially manifold learning for high-dimensional data. Because high-dimensional data have complex nonlinear structures, many researchers focus on nonlinear methods. However, when the scale of the data is very large, the memory cost and running time become prohibitively large. To solve this problem, we use gradient descent to search for the low-dimensional embedding, replacing the eigenvalue decomposition of a large sparse matrix in the LLE (Locally Linear Embedding) algorithm. The time complexity is lower than before and the memory consumption is reduced significantly. Experimental results demonstrate that our approach performs better than the original algorithm. Furthermore, our approach can be applied to other manifold learning methods and to other research fields such as information retrieval and feature extraction.
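The core idea described in the abstract, replacing the eigendecomposition of the sparse LLE matrix M = (I - W)ᵀ(I - W) with an iterative gradient-descent search for the embedding Y, can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's actual implementation: the function name, learning rate, iteration count, and the way the LLE constraints (zero mean, unit covariance) are re-imposed each step are all assumptions.

```python
import numpy as np

def lle_embedding_gd(W, d, lr=0.1, n_iter=500, seed=0):
    """Sketch: gradient-descent search for a d-dimensional LLE embedding.

    W : (n, n) matrix of LLE reconstruction weights (rows sum to 1).
    In standard LLE the embedding is given by the bottom eigenvectors of
    M = (I - W)^T (I - W); here we instead minimize the LLE cost
    tr(Y^T M Y) by gradient descent, re-centering and orthonormalizing
    Y after each step to avoid the trivial zero/constant solutions.
    """
    n = W.shape[0]
    I = np.eye(n)
    M = (I - W).T @ (I - W)       # sparse in practice; dense here for clarity
    rng = np.random.default_rng(seed)
    Y = rng.standard_normal((n, d))
    for _ in range(n_iter):
        Y -= lr * (2.0 * M @ Y)   # gradient of tr(Y^T M Y) w.r.t. Y
        Y -= Y.mean(axis=0)       # zero-mean constraint
        Q, _ = np.linalg.qr(Y)    # orthonormalize columns
        Y = Q * np.sqrt(n)        # so that (1/n) Y^T Y = I
    return Y
```

Each iteration costs one sparse matrix-vector-style product per embedding dimension, which is the source of the claimed savings over a full eigensolve of the n-by-n matrix; note the centering keeps Y orthogonal to the discarded constant eigenvector.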
Open Access
This is an open access article distributed under the CC BY-NC license.


Proceedings
3rd International Conference on Multimedia Technology (ICMT-13)
Part of series
Advances in Intelligent Systems Research
Publication Date
November 2013
ISBN
978-90-78677-89-5
ISSN
1951-6851
DOI
https://doi.org/10.2991/icmt-13.2013.145

Cite this article

TY  - CONF
AU  - Wang Yunhe
AU  - Gao Yuan
AU  - Xu Chao
PY  - 2013/11
DA  - 2013/11
TI  - Manifold Learning Method for Large Scale Dataset Based on Gradient Descent
BT  - 3rd International Conference on Multimedia Technology (ICMT-13)
PB  - Atlantis Press
SN  - 1951-6851
UR  - https://doi.org/10.2991/icmt-13.2013.145
DO  - https://doi.org/10.2991/icmt-13.2013.145
ID  - Yunhe2013/11
ER  -