Abstract
This paper presents a hardware-optimized, machine-learning-based Electrooculogram (EOG) signal processor for classifying eye movements. EOG records the standing (resting) potential between the cornea and the retina; this recording can be used to recognize eye movements, which is useful in many Human-Computer Interface (HCI) applications. In this work, EOG signals are used to identify six eye-movement classes (up, down, normal, right, left, and blink) using a linear support vector machine (SVM) classifier, achieving a software accuracy of 97.92%. The system is implemented on a Zynq UltraScale+ FPGA SoC and combines serial and parallel processing for hardware optimization. The implemented system achieves an average hardware accuracy of 95.56%. Resource and power utilization are analyzed: the on-chip power consumption is 1.446 W, of which 0.850 W is dynamic power and 0.596 W is static power. A comparative study confirms the efficacy of the prototype.
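At inference time, the linear SVM described above reduces to a set of dot products: one linear score per class, with the predicted eye movement taken as the class with the largest score (in a one-vs-rest scheme). This is what makes the classifier attractive for FPGA implementation. The sketch below is illustrative only: the weights, biases, feature layout, and class ordering are assumptions, not values from the paper.

```python
# Illustrative one-vs-rest linear SVM inference, as it might be mapped
# to hardware: one multiply-accumulate chain per class, then an argmax.
# Class names follow the six movements listed in the abstract; their
# ordering here is an assumption.
CLASSES = ["up", "down", "normal", "right", "left", "blink"]

def predict(features, weights, biases):
    """Return the class whose linear score w . x + b is largest.

    features : list of floats (one feature vector)
    weights  : list of per-class weight vectors, same length as CLASSES
    biases   : list of per-class bias terms, same length as CLASSES
    """
    scores = []
    for w, b in zip(weights, biases):
        # Dot product plus bias -- a multiply-accumulate in hardware.
        s = sum(wi * xi for wi, xi in zip(w, features)) + b
        scores.append(s)
    # Argmax over class scores selects the predicted movement.
    return CLASSES[scores.index(max(scores))]
```

Because the decision rule is just multiply-accumulates followed by a comparison tree, the per-class score computations can be serialized or parallelized freely, matching the serial-parallel trade-off the paper exploits for hardware optimization.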
© The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2024
| Original language | English |
| --- | --- |
| Number of pages | 14 |
| Journal | Journal of Signal Processing Systems |
| Online published | 12 Oct 2024 |
| DOIs | 10.1007/s11265-024-01936-5 |
| Publication status | Online published - 12 Oct 2024 |
Research Keywords
- Classification
- Electrooculogram (EOG)
- Eye movements
- Field programmable gate array (FPGA)
- Support vector machine (SVM)
Publisher's Copyright Statement
- COPYRIGHT TERMS OF DEPOSITED POSTPRINT FILE: This version of the article has been accepted for publication, after peer review (when applicable) and is subject to Springer Nature’s AM terms of use, but is not the Version of Record and does not reflect post-acceptance improvements, or any corrections. The Version of Record is available online at: https://doi.org/10.1007/s11265-024-01936-5.