Classification Analysis of the Auditory Cognitive States for Sighted and Blind People Based on a Whole-brain "Searchlight" SVM
- 10.2991/acaai-18.2018.20
- sound identification; sound localization; blind; functional magnetic resonance imaging; support vector machine
In auditory perception, it remains unclear how people identify and localize sounds efficiently, even in noisy environments. Although the dual-pathway model is widely held, the specific brain regions involved are still debated. Meanwhile, machine learning methods have been applied successfully to study neural pattern information in the brain. In this study, both sighted and blind subjects were recruited and instructed to perform a sound localization and identification task. Functional magnetic resonance imaging (fMRI) data were collected, and a whole-brain "searchlight" support vector machine (SVM) was used to decode the pattern information. We identified distinct cortical networks encoding sound pattern and spatial information, and we also found a large-scale cortical reorganization in blind people from a multi-voxel perspective.
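The abstract's decoding approach can be illustrated in miniature. The sketch below is not the authors' pipeline: it uses synthetic data in place of real fMRI volumes, and the grid size, searchlight radius, and injected signal are illustrative assumptions. It shows the core searchlight idea: for every voxel, train and cross-validate an SVM on the small neighbourhood of voxels around it, yielding a whole-brain map of decoding accuracy.

```python
# Minimal searchlight-SVM sketch on synthetic data (NOT the paper's pipeline).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic "brain": 40 trials over an 8x8x8 voxel grid, two conditions.
n_trials, shape = 40, (8, 8, 8)
labels = np.repeat([0, 1], n_trials // 2)
data = rng.normal(size=(n_trials,) + shape)
# Inject a weak condition effect near the centre so some searchlights
# decode above chance, mimicking an informative cortical region.
data[labels == 1, 3:5, 3:5, 3:5] += 1.0

def searchlight(data, labels, radius=1):
    """Slide a cubic neighbourhood over each voxel and record the
    cross-validated accuracy of a linear SVM trained on that neighbourhood."""
    acc = np.zeros(shape)
    for x in range(shape[0]):
        for y in range(shape[1]):
            for z in range(shape[2]):
                xs = slice(max(x - radius, 0), x + radius + 1)
                ys = slice(max(y - radius, 0), y + radius + 1)
                zs = slice(max(z - radius, 0), z + radius + 1)
                features = data[:, xs, ys, zs].reshape(len(labels), -1)
                acc[x, y, z] = cross_val_score(
                    SVC(kernel="linear"), features, labels, cv=4).mean()
    return acc

accuracy_map = searchlight(data, labels)
print("map shape:", accuracy_map.shape)
print("peak accuracy:", accuracy_map.max())
```

In practice, studies like this one typically run the searchlight over real masked fMRI volumes (e.g. with spherical neighbourhoods) and assess the resulting accuracy maps statistically across subjects; this toy version only demonstrates the mechanics.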
- © 2018, the Authors. Published by Atlantis Press.
- Open Access
- This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
Cite this article
TY  - CONF
AU  - Jianlong Zheng
AU  - Junhai Xu
PY  - 2018/03
DA  - 2018/03
TI  - Classification Analysis of the Auditory Cognitive States for Sighted and Blind People Based on a Whole-brain "Searchlight" SVM
BT  - Proceedings of the 2018 International Conference on Advanced Control, Automation and Artificial Intelligence (ACAAI 2018)
PB  - Atlantis Press
SP  - 80
EP  - 82
SN  - 1951-6851
UR  - https://doi.org/10.2991/acaai-18.2018.20
DO  - 10.2991/acaai-18.2018.20
ID  - Zheng2018/03
ER  -