NeuroGrasp: AI-Driven EEG-Based Brain-Computer Interface for Intuitive Control of Neuroprostheses in Spinal Cord Injured Individuals and Amputees

The main principle of this project is as follows: the end-user attempts a movement, e.g., a palmar grasp; the BCI detects this attempt through EEG pattern recognition; and a command is sent to the (neuro)prosthesis to perform the desired grasp. The Graz BCI group was the first to combine a BCI with a neuroprosthesis in a person with SCI. We have also shown that upper-limb movement attempts can be decoded from the low-frequency EEG (i.e., from movement-related cortical potentials), and we demonstrated this in real time with a person with complete hand paralysis after spinal cord injury. However, the error rate of such a system is still too high. Building on this foundation, we aim to make BCIs more intuitive and accessible by applying artificial intelligence (AI), specifically machine learning and deep learning techniques. These methods allow us to move beyond conventional EEG decoding and reveal richer, more informative patterns in brain signals. The newly developed AI-driven BCI will be tested as a control modality by individuals with SCI and amputees in real-life scenarios.
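To make the detection step concrete, the sketch below illustrates the general idea of decoding movement attempts from low-frequency EEG: epochs are low-pass filtered and a linear classifier separates "attempted movement" from "rest". This is a minimal toy example on synthetic data, not the project's actual pipeline; the sampling rate, the moving-average filter, the simulated MRCP waveform, and the ridge-regularised least-squares classifier are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

fs = 200                   # hypothetical sampling rate (Hz)
n_trials, n_samp = 60, fs  # 1-s epochs per class

# Synthetic data: "movement attempt" epochs carry a slow negative
# deflection (a toy movement-related cortical potential, MRCP);
# "rest" epochs contain noise only.
t = np.arange(n_samp) / fs
mrcp = -2.0 * np.exp(-((t - 0.6) ** 2) / 0.02)
X_move = rng.normal(0, 1, (n_trials, n_samp)) + mrcp
X_rest = rng.normal(0, 1, (n_trials, n_samp))

def lowpass(x, k=20):
    """Moving-average low-pass filter: a crude stand-in for isolating
    the low-frequency band in which MRCPs live."""
    kernel = np.ones(k) / k
    return np.apply_along_axis(lambda r: np.convolve(r, kernel, "same"), 1, x)

X = lowpass(np.vstack([X_move, X_rest]))
y = np.r_[np.ones(n_trials), -np.ones(n_trials)]

# Ridge-regularised least-squares linear classifier, trained on a
# random half of the trials and evaluated on the held-out half.
idx = rng.permutation(2 * n_trials)
tr, te = idx[:n_trials], idx[n_trials:]
A = np.c_[X[tr], np.ones(len(tr))]            # features + bias column
w = np.linalg.solve(A.T @ A + 1e-2 * np.eye(A.shape[1]), A.T @ y[tr])
pred = np.sign(np.c_[X[te], np.ones(len(te))] @ w)
acc = (pred == y[te]).mean()
print(f"held-out accuracy: {acc:.2f}")
```

In a real system this classification would run continuously on streamed EEG, and a detected movement attempt would trigger the grasp command to the (neuro)prosthesis.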

NeuroGrasp Team

Prof. Dr. Gernot Müller-Putz
Dr. Kyriaki Kostoglou
Kathrin Sterk

Project Partner

AUVA Rehabilitationszentrum Weißer Hof

AUVA Rehabilitationszentrum Tobelbad