eyeSay: Brain Visual Dynamics Decoding with Deep Learning & Edge Computing

Published in IEEE Transactions on Neural Systems and Rehabilitation Engineering (TNSRE)

Authors: Jiadao Zou, Qingxue Zhang

Brain visual dynamics encode rich functional and biological patterns of the neural system and, if decoded, hold great promise for many applications such as intention understanding, cognitive load quantification, and neural disorder measurement. We here focus on understanding brain visual dynamics for the amyotrophic lateral sclerosis (ALS) population, and propose a novel system that allows these so-called 'locked-in' patients to 'speak' with their brain visual movements. More specifically, we propose an intelligent system to decode the eye bio-potential signal, the electrooculogram (EOG), and thereby understand the patients' intentions. We first propose to leverage a deep learning framework for automatic feature learning and classification of the brain visual dynamics, aiming to translate the EOG into meaningful words. We then design and develop an edge computing platform on the smartphone, which can execute the deep learning algorithm, visualize the brain visual dynamics, and display the edge inference results, all in real time. Evaluated on 4,500 trials of brain visual movements performed by multiple users, our novel system has demonstrated a high eye-word recognition rate of up to 90.47%. The system is shown to be intelligent, effective, and convenient for decoding brain visual dynamics for ALS patients. This research is thus expected to greatly advance the decoding and understanding of brain visual dynamics by leveraging machine learning and edge computing innovations.
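To make the decoding pipeline concrete, the sketch below shows a minimal 1D-CNN-style forward pass that maps a windowed EOG signal to word-class probabilities. This is a hypothetical illustration only, not the authors' architecture: the kernel count, window length, vocabulary size, and all weights are assumed placeholders, and a real system would learn them from labeled EOG trials.

```python
import numpy as np

# Hypothetical sketch (NOT the paper's architecture): a 1D convolutional
# feature extractor followed by global average pooling and a softmax
# classifier over a small word vocabulary.

rng = np.random.default_rng(0)

def conv1d(x, kernels):
    """Valid-mode 1D convolution: (T,) signal x (K, W) kernels -> (K, T-W+1)."""
    w = kernels.shape[1]
    windows = np.lib.stride_tricks.sliding_window_view(x, w)  # (T-W+1, W)
    return kernels @ windows.T

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def forward(eog_window, kernels, weights, bias):
    feats = np.maximum(conv1d(eog_window, kernels), 0.0)  # ReLU feature maps
    pooled = feats.mean(axis=1)                           # global average pooling
    return softmax(weights @ pooled + bias)               # word-class probabilities

# Toy example: a 250-sample EOG window, 8 assumed kernels, 10 candidate words.
x = rng.standard_normal(250)           # stand-in for a filtered EOG segment
kernels = rng.standard_normal((8, 11)) * 0.1
weights = rng.standard_normal((10, 8)) * 0.1
bias = np.zeros(10)

probs = forward(x, kernels, weights, bias)
predicted_word = int(np.argmax(probs))  # index into an assumed word vocabulary
```

For on-device ("edge") inference as described above, such a trained network would typically be exported to a mobile runtime so the smartphone can classify each EOG window in real time.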

Access the Full Paper on IEEE Xplore®
