Not being able to communicate while still being conscious is a horrifying prospect for many patients worldwide. Patients with motor neuron disorders, trauma, or stroke risk losing all muscle control, leading to Locked-In Syndrome (LIS), which leaves them completely paralysed and unable to communicate. This is an intolerable, frightening situation with very low quality of life and an extreme burden of care for patients, families, and caregivers. Intracranial Neuro Telemetry to REstore COMmunication (INTRECOM) will provide a breakthrough for these patients by developing a novel, fully implantable Brain-Computer Interface (BCI) technology that allows for real-time speech decoding and use in the home environment. This BCI system will go significantly beyond current technology by providing a high-resolution, sustainable device that combines state-of-the-art hardware and software based on Artificial Intelligence (AI) to liberate LIS patients from their isolation.
We have demonstrated a long history of research in developing brain-computer interfaces for the control of upper-limb neuroprostheses to restore hand and grasping function in individuals with cervical spinal cord injury. The most recent project in this line, MoreGrasp, funded by the European Commission, introduced new naturalistic control strategies for this application.
NeuroGrasp
Regaining arm and hand function is one of the highest priorities for people with cervical spinal cord injury (SCI) and upper-limb amputations. Prostheses are artificial devices designed to replace a missing limb or restore lost function. While traditional prostheses rely mainly on electromyographic control or external Bowden-cable actuation, neuroprostheses take a more advanced approach by using neural signals from the brain to control the device. Electroencephalography (EEG)-based brain-computer interfaces (BCIs) provide a noninvasive link between the user’s brain activity and the neuroprosthesis. The main principle of this project works as follows: the end user attempts a movement, e.g., a palmar grasp; the BCI detects this attempt through EEG pattern recognition; and a command is sent to the (neuro)prosthesis to perform the desired grasp. The Graz BCI group was the first to combine a BCI with a neuroprosthesis in a person with SCI. Furthermore, we showed that upper-limb movement attempts can be decoded from the low-frequency EEG (i.e., movement-related cortical potentials), and we demonstrated this in real time with a person with complete hand paralysis after spinal cord injury. However, the error rate of such a system is still too high. Building on this foundation, we aim to make BCIs more intuitive and accessible by using artificial intelligence (AI), specifically machine learning and deep learning techniques. These technologies enable us to move beyond conventional EEG decoding methods and reveal richer, more informative patterns in brain signals. The newly developed AI-driven BCI will be tested as a control modality in real-life scenarios with SCI individuals and amputees.
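To make the detection step described above concrete, here is a minimal sketch in Python (NumPy, SciPy, scikit-learn) of classifying movement attempts from the low-frequency EEG with a shrinkage-regularized LDA, a common approach to movement-related cortical potentials. All data, shapes, and parameters are illustrative assumptions, not the project's actual pipeline.

```python
# Minimal sketch of MRCP-based detection of a movement attempt.
# Hypothetical data shapes and parameters; not the actual project pipeline.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

fs = 256                           # sampling rate in Hz (assumed)
rng = np.random.default_rng(0)

# Synthetic stand-in for epoched EEG: (trials, channels, samples),
# e.g. 2-second epochs around the cue to attempt a palmar grasp.
X = rng.standard_normal((200, 16, 2 * fs))
y = rng.integers(0, 2, size=200)   # 1 = movement attempt, 0 = rest

# Movement-related cortical potentials live in the low-frequency band,
# so low-pass filter below ~3 Hz before feature extraction.
b, a = butter(4, 3.0, btype="low", fs=fs)
X_lp = filtfilt(b, a, X, axis=-1)

# Downsample and flatten each epoch into a feature vector.
features = X_lp[:, :, ::16].reshape(len(X_lp), -1)

# Shrinkage LDA copes well with high-dimensional, correlated EEG features.
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
clf.fit(features[:150], y[:150])
print("held-out accuracy:", clf.score(features[150:], y[150:]))

# In the online system, a detected attempt would then be mapped to a
# neuroprosthesis command, e.g. trigger_grasp("palmar")  (hypothetical).
```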
This project lays the groundwork for the next generation of robotic arm control for people with high cervical spinal cord lesions. Working non-invasively, we are investigating new methods to decode complex arm and hand movements from multi-channel EEG.
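One established way to decode continuous movement from multi-channel EEG is a linear mapping from time-lagged, low-frequency signals to hand kinematics. The sketch below illustrates this with ridge regression on synthetic data; the lag structure, shapes, and regularization strength are assumptions for illustration only.

```python
# Sketch: continuous decoding of hand velocity from multi-channel EEG
# via ridge regression on time-lagged signals.
# All shapes, lags, and data are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
n_samples, n_channels, n_lags = 5000, 32, 10

eeg = rng.standard_normal((n_samples, n_channels))   # low-pass-filtered EEG
velocity = rng.standard_normal((n_samples, 2))       # x/y hand velocity

# Build a lagged feature matrix: each row holds the last n_lags samples
# of every channel, so the decoder can exploit temporal structure.
rows = [eeg[t - n_lags:t].ravel() for t in range(n_lags, n_samples)]
X = np.asarray(rows)
Y = velocity[n_lags:]

# Ridge regularization keeps the many correlated lag features in check.
decoder = Ridge(alpha=10.0).fit(X[:4000], Y[:4000])
pred = decoder.predict(X[4000:])
print("predicted velocity shape:", pred.shape)
```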
Communication is one of the most important abilities for humans. Individuals with neurodegenerative diseases, or even in a minimally conscious state, may be completely unable to communicate with other people. The goal of this project is to develop a simple communication tool, based on the analysis of brain signals, that allows these persons at least rudimentary communication with their environment.
Passive BCIs can be applied in many situations, e.g., for detecting errors during control or for measuring mental workload or attention. This project deals with the investigation, understanding, and ultimately the detection of perturbations a person perceives in various scenarios.
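A common approach in the passive-BCI literature is to look for error-related brain responses time-locked to the perturbation and classify them with windowed-mean features. The sketch below illustrates this on synthetic epochs; the epoch layout, window sizes, and data are assumptions, not the project's actual method.

```python
# Sketch: detecting a perceived perturbation from EEG epochs time-locked
# to the event, using windowed-mean features and shrinkage LDA.
# Epoch layout and window choices are illustrative assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

fs = 256
rng = np.random.default_rng(2)

# Synthetic epochs: (trials, channels, samples), 0-800 ms after the event.
X = rng.standard_normal((300, 8, int(0.8 * fs)))
y = rng.integers(0, 2, size=300)   # 1 = perturbation perceived, 0 = not

# Average amplitude in successive 100 ms windows per channel; such
# windowed means capture the slow deflections of error-related potentials.
win = int(0.1 * fs)
n_win = X.shape[-1] // win
feats = X[:, :, :n_win * win].reshape(len(X), X.shape[1], n_win, win).mean(-1)
feats = feats.reshape(len(X), -1)

clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
clf.fit(feats[:200], y[:200])
print("held-out accuracy:", clf.score(feats[200:], y[200:]))
```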
Can we develop technologies that allow users to let their minds wander, increasing the likelihood of "Eureka" or "Aha" moments? Through an experimental approach and the triangulation of self-reports with synchronized physiological measures (electroencephalography (EEG) and eye tracking), we seek to answer this question. Our results may have important implications for the design of digital workplaces and may help to foster the irreplaceable human ability to find innovative solutions.
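One practical way to obtain such synchronized EEG and eye-tracking measures is Lab Streaming Layer. The sketch below uses pylsl to pull both modalities on a shared clock; the stream types "EEG" and "Gaze" are assumptions about the recording setup, not a description of the project's actual implementation.

```python
# Sketch: pulling time-synchronized EEG and eye-tracking samples via
# Lab Streaming Layer (pylsl). Assumes streams of type "EEG" and "Gaze"
# are already published on the local network (hypothetical setup).
from pylsl import StreamInlet, resolve_byprop

eeg_info = resolve_byprop("type", "EEG", timeout=5.0)
gaze_info = resolve_byprop("type", "Gaze", timeout=5.0)
if not eeg_info or not gaze_info:
    raise RuntimeError("EEG and/or Gaze stream not found on the network")

eeg = StreamInlet(eeg_info[0])
gaze = StreamInlet(gaze_info[0])

for _ in range(100):
    # Timestamps share the LSL clock, so the two modalities can be
    # aligned offline for triangulation with self-reports.
    eeg_sample, t_eeg = eeg.pull_sample(timeout=1.0)
    gaze_sample, t_gaze = gaze.pull_sample(timeout=1.0)
    if eeg_sample and gaze_sample:
        print(f"EEG@{t_eeg:.3f}  gaze@{t_gaze:.3f}")
```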