Application of Gated Recurrent Unit in Electroencephalogram (EEG)-Based Mental State Classification

  • Gst. Ayu Vida Mastrika Giri, Informatics Study Program, Faculty of Mathematics and Natural Sciences, Universitas Udayana
  • Ngurah Agus Sanjaya ER, Informatics Study Program, Faculty of Mathematics and Natural Sciences, Universitas Udayana
  • I Ketut Gede Suhartana, Informatics Study Program, Faculty of Mathematics and Natural Sciences, Universitas Udayana
Keywords: Deep Learning, Electroencephalogram (EEG), Gated Recurrent Unit (GRU), Mental State Classification

Abstract

The classification of mental states from electroencephalogram (EEG) recordings has recently gained significant interest in cognitive monitoring and human-computer interaction. Despite advances in EEG signal analysis, accurate classification remains challenging due to high signal variability and sensitivity to noise. Among deep learning models, the Gated Recurrent Unit (GRU) has shown great potential for sequential EEG data analysis, yet GRUs have received less attention for mental state classification than hybrid and convolutional models. In this paper, we propose a GRU-based model for classifying concentrated and relaxed mental states. We analyzed 400 EEG recordings collected from 10 subjects in a controlled environment using the Muse EEG Headband. The mean, standard deviation, skewness, kurtosis, power spectral density, zero-crossing rate, and root mean square were extracted as statistical features from the raw EEG data. After parameter tuning, the GRU-based model achieved an average accuracy of 95.94%, with precision, recall, and F1-scores between 0.95 and 0.97 over 5-fold cross-validation. These results indicate that GRU-based models are effective for classifying mental states from EEG data.
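The abstract names seven per-window statistical features. As a rough illustrative sketch (not the authors' code), the following computes them for a single EEG channel, assuming a windowed 1-D sample array and the Muse headband's nominal 256 Hz sampling rate:

```python
import numpy as np

def extract_features(window, fs=256.0):
    """Seven per-window statistical features for one EEG channel
    (1-D array of samples); feature names follow the abstract."""
    x = np.asarray(window, dtype=float)
    n = x.size
    mu, sigma = x.mean(), x.std()
    z = (x - mu) / sigma
    # one-sided periodogram as a simple power spectral density estimate
    psd = np.abs(np.fft.rfft(x)) ** 2 / (fs * n)
    psd[1:-1] *= 2.0                 # fold negative frequencies
    df = fs / n                      # frequency resolution
    return {
        "mean": mu,
        "std": sigma,
        "skewness": np.mean(z ** 3),
        "kurtosis": np.mean(z ** 4) - 3.0,  # excess kurtosis
        "psd_total": np.sum(psd) * df,      # total spectral power
        "zcr": np.mean(np.abs(np.diff(np.sign(x))) > 0),  # zero-crossing rate
        "rms": np.sqrt(np.mean(x ** 2)),
    }
```

In practice each feature vector (one per window, per channel) would then be assembled into the sequences fed to the GRU; the exact windowing and channel handling used in the paper are not specified on this page.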
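The GRU at the heart of the proposed model can be summarized by its gating equations. The minimal NumPy cell below is an illustrative sketch of the standard GRU formulation, not the paper's implementation; all names, sizes, and the initialization scheme are assumptions:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal GRU cell: update gate z, reset gate r, candidate state."""
    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        # one input weight, recurrent weight, and bias per gate (z, r, h)
        self.W = {g: rng.uniform(-s, s, (hidden_size, input_size)) for g in "zrh"}
        self.U = {g: rng.uniform(-s, s, (hidden_size, hidden_size)) for g in "zrh"}
        self.b = {g: np.zeros(hidden_size) for g in "zrh"}

    def step(self, x, h):
        z = sigmoid(self.W["z"] @ x + self.U["z"] @ h + self.b["z"])  # update gate
        r = sigmoid(self.W["r"] @ x + self.U["r"] @ h + self.b["r"])  # reset gate
        h_tilde = np.tanh(self.W["h"] @ x + self.U["h"] @ (r * h) + self.b["h"])
        return (1.0 - z) * h + z * h_tilde  # blend old state and candidate

    def run(self, sequence):
        """Fold a (T, input_size) sequence into the final hidden state."""
        h = np.zeros(self.U["z"].shape[0])
        for x in sequence:
            h = self.step(x, h)
        return h
```

The gating is what suits GRUs to sequential EEG data: the update gate decides how much of the past window's summary to carry forward, so the final hidden state can feed a small classification head for the concentrated/relaxed decision.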



Published
2025-01-10
How to Cite
G. A. V. M. Giri, N. A. Sanjaya ER, and I. K. G. Suhartana, “Application of Gated Recurrent Unit in Electroencephalogram (EEG)-Based Mental State Classification”, JAIC, vol. 9, no. 1, pp. 8-15, Jan. 2025.