Non-invasive Online Self-correcting Closed-loop Brain-Computer Interface for Decoding and Control of Motor Imagery Hand Movement Kinematics

Project Team:
  • Vinod A PRASAD, Professor
  • Vivek BALACHANDRAN, Associate Professor
  • Li Khim KWAH, Associate Professor
  • I-ling YEH, Assistant Professor
  • NG Yee Sien (SGH), Researcher
  • Jijomon C M, Researcher
  • Sagila Gangadharan K, Researcher
  • Devika K M, Researcher

The project aims to develop an online, closed-loop, self-correcting Motor Imagery-based Brain-Computer Interface (MI-BCI) to decode imagined hand movement direction and speed.

By detecting its own decoding errors, the system adapts its model online to improve decoding accuracy over time.

The framework will be validated on both healthy participants and stroke patients to ensure reliability for neurorehabilitation applications.
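To make the closed-loop, self-correcting idea concrete, here is a minimal illustrative sketch, not the project's actual method: a linear decoder over synthetic EEG-like features predicts imagined movement direction, and a perceptron-style update is applied only when the loop flags an erroneous prediction (standing in for, e.g., error-related potential detection). All names, feature dimensions, and the learning rule are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N_FEATURES = 8        # hypothetical band-power features from motor-cortex channels
LEARNING_RATE = 0.05

weights = np.zeros(N_FEATURES)
bias = 0.0

def decode(features):
    """Predict imagined direction: +1 (right) or -1 (left)."""
    return 1 if features @ weights + bias >= 0 else -1

def self_correct(features, prediction, error_detected):
    """Update the decoder only when the closed loop reports an error."""
    global weights, bias
    if error_detected:
        # For a binary decoder, the true label is the opposite of the
        # erroneous prediction.
        true_label = -prediction
        weights += LEARNING_RATE * true_label * features
        bias += LEARNING_RATE * true_label

# Simulated session: features are linearly separable around a hidden direction.
hidden = rng.normal(size=N_FEATURES)
errors_per_block = []
for block in range(5):
    block_errors = 0
    for _ in range(200):
        x = rng.normal(size=N_FEATURES)
        true_dir = 1 if x @ hidden >= 0 else -1
        pred = decode(x)
        error = pred != true_dir      # stands in for online error detection
        block_errors += error
        self_correct(x, pred, error)
    errors_per_block.append(block_errors)

print(errors_per_block)  # error counts should trend downward across blocks
```

In this toy loop the error count per block falls as the decoder adapts; the project's decoder would instead operate on real EEG features and infer errors from neural signals rather than ground truth.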
 

Project Deliverables/Outcomes/Impact:
  • A BCI that enables more natural control of motor activities with increased accuracy and robustness.
  • Improved standard of care for stroke patients through BCI-based motor rehabilitation.

[Figure: A participant in an EEG experiment, wearing a sensor-laden cap connected to a recording device, while watching a computer monitor displaying a red dot.]