Recurrent neural networks in disabled communication

An extended research project titled Prospects of brain-computer interfaces in communication for disabled children

Tahmid Azam

Date: 5/1/2022
Size: 273 words, 2 min read
Languages & Frameworks: MATLAB

All figures, except Figures 1, 3, 4, 5, and 6, were either programmatically generated using MATLAB and EEGLAB, created in Adobe Photoshop, or plotted in Microsoft Excel for this study. All scripts and functions, both for the generation of figures and for the research itself, are original and were written for this study. The scripts, functions, interview transcript, figures, and the processed dataset can be found in the study’s repository.

This project was presented on February 3rd, 2023, at Magdalen College School, Oxford, at the Waynflete Studies Evening. Content from slides displayed in the presentation is included below in addition to the project content.

This project is dedicated to my brother, Tahsin.

Abstract

Communication is a pillar of societal interaction for humans, pivotal from an early age; an incapacity to communicate universally obstructs a child’s social and emotional development. This project explores the impacts of speech, language, and communication needs, and how an augmentative and alternative communication system, specifically a brain-computer interface, can lessen them. Consumer electroencephalography systems for recording electrical activity in the brain are becoming more affordable, and the number of large datasets available is increasing, making neural networks a prospective option for motor imagery classification. Moreover, the high computing performance and power efficiency of today’s mobile devices allow them to host these neural network classifiers. Both brain imaging and classification are integral to the workings of a brain-computer interface. This project finds that a unidirectional, long short-term memory, recurrent neural network can classify left- and right-hand motor imageries to an accuracy of 91.95%, demonstrating a high level of capability.
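The classifier summarised above can be sketched with MATLAB’s Deep Learning Toolbox. This is a minimal illustration only: the channel count, hidden-unit count, and training options here are assumptions for the sake of example, not the study’s reported configuration.

```matlab
% Sketch of a unidirectional LSTM for two-class motor imagery
% classification, assuming preprocessed EEG epochs supplied as
% sequences of per-timestep feature vectors (channels x time).
numChannels    = 64;   % assumed electrode count (illustrative)
numHiddenUnits = 100;  % illustrative LSTM size
numClasses     = 2;    % left- vs right-hand imagery

layers = [
    sequenceInputLayer(numChannels)
    % 'OutputMode','last' keeps the network unidirectional and uses the
    % final hidden state to summarise the whole epoch for classification.
    lstmLayer(numHiddenUnits, 'OutputMode', 'last')
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer];

options = trainingOptions('adam', ...
    'MaxEpochs', 30, ...
    'MiniBatchSize', 64, ...
    'Shuffle', 'every-epoch');

% XTrain: cell array of [numChannels x numTimesteps] sequences;
% YTrain: categorical labels ("left" / "right").
% net = trainNetwork(XTrain, YTrain, layers, options);
```

Taking only the last timestep’s hidden state, rather than the full output sequence, is the standard sequence-to-label pattern for classifying a whole EEG epoch with one prediction.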