Journal of Biomedical Engineering

Research on recognition of emotional states represented by electroencephalogram signals based on deep belief network

With the rapid development of machine learning techniques, deep learning algorithms have been widely applied to one-dimensional physiological signal processing. In this paper, a deep belief network (DBN) model built with an open-source deep learning framework was used to recognize three emotional states (positive, negative, and neutral) from electroencephalography (EEG) signals, and its recognition performance was compared with that of a support vector machine (SVM). EEG signals were recorded from subjects under different emotional stimuli, and the DBN and SVM were applied to emotion-representation data derived from different feature transforms and different frequency bands. The average recognition accuracy of the DBN on the differential entropy (DE) feature was 89.12% ± 6.54%, better than previously reported results on the same data set and higher than that of the traditional SVM (average classification accuracy 84.2% ± 9.24%), with a favorable trend in both accuracy and stability. In three repeated sessions at different time points, each subject obtained consistent classification accuracies with the DBN (mean standard deviation 1.44%), indicating stable performance and good repeatability. Among the features examined, DE gave the best classification accuracy, and the Beta and Gamma bands yielded the highest classification performance in the emotion recognition model. In summary, deep learning improves the accuracy of emotion recognition and offers a useful reference for building more accurate auxiliary emotion recognition systems. Moreover, the classification results can be traced back to identify the brain regions and frequency bands most closely related to emotional states, deepening our understanding of emotional mechanisms. The study therefore has academic and practical value for EEG-based emotion recognition and merits further investigation.
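To make the feature pipeline described above concrete, the following Python sketch (not the authors' code) filters EEG into the classical frequency bands, computes the differential entropy (DE) of each band and channel, and trains a baseline classifier. The 200 Hz sampling rate, 1 s windows, 62 channels, and scikit-learn SVC baseline are illustrative assumptions; the DBN used in the paper, typically stacked restricted Boltzmann machines with a softmax output layer, would replace the SVM in the final step.

```python
# Minimal sketch of DE feature extraction for EEG emotion recognition.
# Assumptions (not from the paper): 200 Hz sampling, 1 s windows, 62 channels,
# RBF-kernel SVM as the baseline classifier.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.svm import SVC

FS = 200  # assumed sampling rate (Hz)
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
         "beta": (14, 31), "gamma": (31, 50)}

def bandpass(x, lo, hi, fs=FS, order=4):
    """Zero-phase Butterworth band-pass filter along the last axis."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x, axis=-1)

def differential_entropy(x):
    """DE of an approximately Gaussian signal: 0.5 * ln(2*pi*e*sigma^2)."""
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x, axis=-1))

def de_features(eeg):
    """eeg: array of shape (n_channels, n_samples) for one time window.
    Returns a flat vector of n_channels * n_bands DE values."""
    feats = [differential_entropy(bandpass(eeg, lo, hi)) for lo, hi in BANDS.values()]
    return np.concatenate(feats)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: 120 one-second windows of 62-channel EEG with 3 emotion labels.
    X = np.stack([de_features(rng.standard_normal((62, FS))) for _ in range(120)])
    y = rng.integers(0, 3, size=120)  # 0 = negative, 1 = neutral, 2 = positive
    clf = SVC(kernel="rbf").fit(X[:90], y[:90])  # SVM baseline from the comparison
    print("toy hold-out accuracy:", clf.score(X[90:], y[90:]))
```

For band-filtered EEG that is approximately Gaussian, DE equals 0.5·ln(2πeσ²) and thus reduces, up to a constant, to the logarithm of the band power, which is why it can be computed directly from the per-band signal variance as above.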

Key words: deep learning; deep belief network; electroencephalogram; emotion recognition

Cite this article: Yang Hao, Zhang Junran, Jiang Xiaomei, Liu Fei. Research on recognition of emotional states represented by electroencephalogram signals based on deep belief network. Journal of Biomedical Engineering, 2018, 35(2): 182-190. doi: 10.7507/1001-5515.201706035
