Abstract

Brain-computer interfaces (BCIs) promise to restore essential hand functionality to motor-impaired individuals by controlling a robotic prosthetic or orthotic hand using neural information extracted from electroencephalography (EEG). Essential hand functionality includes movements such as wrist extension and flexion, finger extension and flexion, and tripod pinch. Only a limited number of EEG studies have focused on combinations of these movements. The current study investigated a novel grouping of these movements: right wrist, right finger, left wrist, and left finger movements. The associated EEG was discriminated in a novel four-class BCI problem. Real and imagined movement data were recorded from healthy test subjects. Independent component analysis (ICA) was used as a spatial filter. Time-frequency features were extracted from the mu and beta EEG frequency bands. A subset of features was selected using the Bhattacharyya distance. Two classifier configurations were compared: a single-stage four-class classifier architecture and a two-stage four-class architecture. Artificial neural networks, support vector machines, Mahalanobis distance clustering, and a group classifier were compared within these architectures. The two-stage group classifier performed best, with average accuracies of 68% and 62% for the real and imagined movements, respectively. These results suggest that multiclass EEG discrimination of hand movements is feasible. Time-frequency plots of the selected features suggest that different underlying neural mechanisms control wrist and finger movements. These findings imply that an EEG-based BCI could control more degrees of freedom of a prosthetic or orthotic hand.
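The abstract does not detail how the Bhattacharyya-distance feature selection was implemented. As a rough, hypothetical sketch only (the function names and the per-feature univariate Gaussian assumption are mine, not the paper's), pairwise feature ranking by Bhattacharyya distance between two movement classes might look like the following in Python:

```python
import numpy as np

def bhattacharyya_distance(x1, x2):
    """Bhattacharyya distance between two samples of one feature,
    assuming each class is modelled by a univariate Gaussian."""
    m1, m2 = np.mean(x1), np.mean(x2)
    v1, v2 = np.var(x1), np.var(x2)
    return (0.25 * np.log(0.25 * (v1 / v2 + v2 / v1 + 2.0))
            + 0.25 * (m1 - m2) ** 2 / (v1 + v2))

def rank_features(feats_a, feats_b, n_select):
    """Rank feature columns of two class-wise matrices (trials x features)
    by Bhattacharyya distance and return the indices of the top n_select."""
    scores = np.array([
        bhattacharyya_distance(feats_a[:, j], feats_b[:, j])
        for j in range(feats_a.shape[1])
    ])
    return np.argsort(scores)[::-1][:n_select]
```

Features with the largest distance separate the two classes best under this assumption; for the four-class problem, such pairwise rankings would have to be combined, which the abstract does not specify.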