EEG Motor Imagery Classification in Node.js with BCI.js

Pierce Stegman · Jan 22

Brain-computer interfaces (BCIs) allow for the control of computers and other devices using only your thoughts.

A popular way to achieve this is with motor imagery detected with electroencephalography (EEG).

This tutorial will serve as an introduction to the detection and classification of motor imagery.

I’ve broken it down into five parts:

1. Loading the data
2. Feature extraction
3. Training the classifier
4. Testing and analyzing the results
5. Improving results

You can find the complete code and data used in this tutorial in its GitHub repo.

Part 1 — Loading the Data

First, make a new directory for the project.

Then, within the project’s directory, download the data to a folder called ‘data.’ We will be using the ‘Four class motor imagery (001–2014)’ data set found at http://bnci-horizon-2020.eu/database/data-sets.


I’ve made the data sets available as CSV files which you can download below:

- feet-training.csv
- feet-testing.csv
- righthand-training.csv
- righthand-testing.csv

You can also download the data using this script for bash or this script for PowerShell.

If you want to experiment with data from a different subject or data set, the script to generate these CSVs from the .mat files is available in the GitHub repo for this tutorial.

You can also find CSV files containing other imagined movements in the data/ directory.

We’ll load the data into Node.js using BCI.js. If you don’t have Node.js installed, you can download it here. Once you have Node.js and NPM installed, install BCI.js by running this command in your project directory:

npm install bcijs

Now we can start on classify.js in the root of the project directory. Begin by requiring BCI.js and loading the relevant CSV files.
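A sketch of that setup, assuming bcijs’s Promise-based loadCSV helper and the four CSV file names from the data section (adjust the paths to match your downloads):

```javascript
// Sketch of the start of classify.js. bcijs's loadCSV (per its docs) returns
// a Promise resolving to a 2-D array: rows are samples, columns are channels.
async function loadData() {
  const bci = require('bcijs'); // npm install bcijs

  const feetTraining = await bci.loadCSV('data/feet-training.csv');
  const rightTraining = await bci.loadCSV('data/righthand-training.csv');
  const feetTesting = await bci.loadCSV('data/feet-testing.csv');
  const rightTesting = await bci.loadCSV('data/righthand-testing.csv');

  return { feetTraining, rightTraining, feetTesting, rightTesting };
}
```

Calling loadData().then(data => …) kicks off the rest of the pipeline once all four files are in memory.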

As the CSV load methods are asynchronous, we have to wrap them in an async function and use the await keyword so Node.js doesn’t continue until the files are loaded.

Part 2 — Feature Extraction

Once the data has been loaded, we will use common spatial patterns (CSP) as part of the feature extraction method.

CSP attempts to project the data so that, when imagining feet movement, the first signal has a high variance and the last signal has the lowest variance, while the opposite occurs when imagining right hand movements.

With this in mind, we can use the variance of each CSP signal as a feature vector for classification.

This tutorial is following the method described by Christian Kothe in his lecture on CSP.

He has great lectures on CSP and BCIs in general, if you want to learn more.

Now we can start on the feature extraction method.

Every 750 samples in the CSV is a separate 3-second trial. We can use the windowApply method from BCI.js to select the data from each trial. The third parameter is the size of the window, and the last parameter is how many steps the window should take each iteration.

For each trial, we split the data into approximately quarter-second windows, or epochs, from which we can generate a feature vector using CSP.

Taking the log of the variance makes the data more normally distributed, which will help when training the classifier later.

We pass epochSize/2 as the window step, so there is a 50% overlap between windows.

The method bci.features.logvar is used to compute the log of the variance of each CSP signal. The value ‘columns’ is passed because, in the 2d array cspSignals, each column is a signal and each row is a sample.
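In plain JavaScript, the per-epoch computation amounts to the following (a conceptual re-implementation for illustration, not bcijs’s own code; bcijs may normalize the variance differently):

```javascript
// Log-variance per column: each column is one CSP signal, each row a sample.
function logvarColumns(matrix) {
  const numCols = matrix[0].length;
  const features = [];
  for (let c = 0; c < numCols; c++) {
    const col = matrix.map(row => row[c]);
    const mean = col.reduce((sum, x) => sum + x, 0) / col.length;
    const variance = col.reduce((sum, x) => sum + (x - mean) ** 2, 0) / col.length;
    features.push(Math.log(variance));
  }
  return features;
}
```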

Finally, we concatenate the features from each trial into one long array of feature vectors.
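Putting Part 2 together, the feature extraction method might look like this sketch (windowApply, cspProject, and features.logvar usage assumed from the bcijs docs; the 64-sample epoch is roughly a quarter second at this data set’s 250 Hz sampling rate, since each 3-second trial is 750 samples):

```javascript
// Feature extraction sketch: split into 750-sample trials, CSP-project each
// trial, then take log-variance features over ~quarter-second epochs.
function featureExtraction(cspParams, data) {
  const bci = require('bcijs'); // npm install bcijs
  const epochSize = 64; // ~0.25 s at 250 Hz (750 samples per 3 s trial)

  const features = [];
  // Every 750 samples is a separate trial (window length 750, step 750).
  bci.windowApply(data, trial => {
    const cspSignals = bci.cspProject(cspParams, trial);
    // 50% overlapping epochs: window length epochSize, step epochSize / 2.
    bci.windowApply(cspSignals, epoch => {
      // One feature vector per epoch: log of the variance of each CSP signal.
      features.push(bci.features.logvar(epoch, 'columns'));
    }, epochSize, epochSize / 2);
  }, 750, 750);

  return features;
}
```

Here cspParams would be learned once from the two training sets with bcijs’s CSP learning call (bci.cspLearn in its docs) before extracting features from either class.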

Part 3 — Training the Classifier

Now that we have features, let’s learn a classifier.

Linear discriminant analysis (LDA) is a popular classifier for BCIs.

It uses the mean and standard deviation of training set data to draw a dividing line (or hyperplane in higher dimensions) between the two classes.
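As a toy one-dimensional illustration of that idea (not bcijs’s implementation, which handles multiple dimensions and class covariance):

```javascript
// With equal class variances, the 1-D LDA boundary is the midpoint of the
// two class means; the projection is negative on classA's side of the
// boundary and positive on classB's side.
function toyLda(classA, classB) {
  const mean = xs => xs.reduce((sum, x) => sum + x, 0) / xs.length;
  const boundary = (mean(classA) + mean(classB)) / 2;
  return x => x - boundary; // sign picks the class, magnitude is confidence
}
```

In BCI.js itself this whole step is one call, bci.ldaLearn(feetFeatures, rightFeatures), whose returned parameters feed ldaProject in Part 4.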

You can find a visual demonstration of how LDA works at https://bci.js.org.

To learn an LDA classifier, you can use the ldaLearn method.

Part 4 — Testing and Analyzing the Results

Now that we’ve learned a classifier, let’s evaluate it using the testing sets.

To do this, we will use the ldaProject method.

It returns a negative number when the prediction is the first class and a positive number when the prediction is the other class.

If it returns a value near zero, the classifier is less certain about the classification.

Once we use ldaProject to predict the classes of the testing set data, we can evaluate its performance using a confusion matrix.

A confusion matrix places the results into a 2×2 matrix where each row corresponds to an actual class and each column to a predicted class, so the diagonal cells count correct classifications and the off-diagonal cells count errors. We can calculate this confusion matrix using the confusionMatrix method.

Finally, we compute the balanced accuracy (the average of the accuracy for feet movements and the accuracy for right hand movements) to see how well our classifier performed (the complete script can be found here). Running this code, we achieve the following results:

Confusion matrix:
1243 275
198 1342

Balanced accuracy:
0.84513

This is a decent result: of 3058 feature vectors, 2585 were classified correctly.
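The evaluation might be sketched as follows; ldaProject and confusionMatrix are the bcijs calls named above (usage assumed from its docs), while balancedAccuracy is a small hand-rolled helper:

```javascript
// Predict each test feature vector (negative projection = feet, positive =
// right hand), then tally a 2x2 confusion matrix with bcijs.
function evaluate(ldaParams, feetFeatures, rightFeatures) {
  const bci = require('bcijs'); // npm install bcijs
  const predicted = [];
  const actual = [];
  feetFeatures.forEach(fv => {
    predicted.push(bci.ldaProject(ldaParams, fv) < 0 ? 0 : 1);
    actual.push(0);
  });
  rightFeatures.forEach(fv => {
    predicted.push(bci.ldaProject(ldaParams, fv) < 0 ? 0 : 1);
    actual.push(1);
  });
  return bci.confusionMatrix(predicted, actual);
}

// Balanced accuracy: average the per-class accuracies. Each row of the
// matrix is one actual class and the diagonal holds the correct counts.
function balancedAccuracy(matrix) {
  const feetAcc = matrix[0][0] / (matrix[0][0] + matrix[0][1]);
  const rightAcc = matrix[1][1] / (matrix[1][0] + matrix[1][1]);
  return (feetAcc + rightAcc) / 2;
}
```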

However, we can achieve an even higher accuracy by having the classifier return ‘unknown’ when ldaProject returns values near zero.

By filtering out values that LDA predicts to be near zero, we can see how the addition of an ‘unknown’ class would affect our results. With this modification, we get the following results:

Confusion matrix:
925 81
67 940

Balanced accuracy:
0.92647

34.17% of the classifications are returned as unknown. While this is a high percentage, at four classifications per second, throwing out about every 1 in 3 results still yields quite a fast response time.
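The thresholding described here can be sketched as a small helper (0.5 is the threshold value the text adjusts later; the class labels are illustrative):

```javascript
// Map an LDA projection to a class label, returning 'unknown' near zero.
function classify(projection, threshold = 0.5) {
  if (Math.abs(projection) < threshold) return 'unknown';
  return projection < 0 ? 'feet' : 'right hand';
}
```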

And at that cost, we’ve achieved 92.6% accuracy. We can get this even higher with a few improvements.

Part 5 — Improving Results

Bandpass Filtering

We can further improve results by bandpass filtering the data to remove noise and unwanted frequency bands. In Christian Kothe’s lecture on CSP, he recommends filtering between 7 and 30 Hz. We can add a bandpass filter using fili:

npm install fili

Then, modify the feature extraction method to include a bandpass filter. This modification gives us the following results:

Confusion matrix:
898 54
54 960

Balanced accuracy:
0.94501

In addition to the accuracy going up to 94.5%, the percent of results classified as unknown has gone down to about 21%.
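The filtering step might look like this sketch, assuming fili’s FirCoeffs and FirFilter API (method names taken from its README; treat the details as approximate). Filtering each channel before CSP is an assumption about where the filter fits into the pipeline:

```javascript
// Bandpass one channel between 7 and 30 Hz with a 128th-order FIR filter.
function bandpass(channel) {
  const fili = require('fili'); // npm install fili
  const filterOrder = 128; // higher order = sharper 7-30 Hz band
  const firCalculator = new fili.FirCoeffs();
  const coeffs = firCalculator.bandpass({ order: filterOrder, Fs: 250, F1: 7, F2: 30 });
  const filter = new fili.FirFilter(coeffs);
  return filter.simulate(channel); // returns the filtered samples
}
```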


In the modified code, you can see we set filterOrder to 128.

The higher order filter means it will more precisely filter between 7 and 30 Hz.

Fili contains an interactive demo of how filter order affects the filter.

For example, passing in the parameters Type: FIR, Fs: 250, Fc: 7, Fc2: 30, Calculation: Sinc Bandpass with a FIR order of 32, and then again with an order of 128, we can compare the two frequency responses. In the first case, only around 19 Hz is kept near 100% while the other bands are attenuated. In the second case, with the higher order filter, values between 7 and 30 Hz are kept closer to 100% while the other bands are filtered to around 0.

Finally, if we allow a higher number of unknowns, we can achieve an even higher accuracy. By changing the threshold in the classify method from 0.5 to 1, we increase the percent unknown to 47.7% and get the following results:

Confusion matrix:
672 16
22 599

Balanced accuracy:
0.97066

Logistic Regression

We can also treat LDA as a dimensionality reduction method and pass its output to logistic regression, which estimates the probability of a sample being in each class.
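This stage might be sketched as follows, assuming js-regression’s LogisticRegression API as its README describes it (constructor options, fit on rows whose last element is the 0/1 label, transform returning a probability); the training parameters are illustrative, not tuned values from the tutorial:

```javascript
// Train logistic regression on LDA projections: label 0 = feet, 1 = right hand.
function trainLogistic(feetProjections, rightProjections) {
  const jsregression = require('js-regression'); // npm install js-regression
  const logistic = new jsregression.LogisticRegression({
    alpha: 0.001,     // learning rate (illustrative)
    iterations: 1000, // training passes (illustrative)
    lambda: 0.0       // no regularization
  });
  // Each training row is [projection, label].
  const trainingData = feetProjections.map(p => [p, 0])
    .concat(rightProjections.map(p => [p, 1]));
  logistic.fit(trainingData);
  // The returned function gives the probability of 'right hand'; values
  // near 0.5 can then be treated as unknown.
  return projection => logistic.transform([projection]);
}
```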

This uses the js-regression package:

npm install js-regression

With this approach, 58.4% of results are classified as unknown, and we get the following results:

Confusion matrix:
568 10
11 451

Balanced accuracy:
0.97944

And that’s it! This was my first tutorial.

I appreciate any feedback and will answer any questions.

Follow me on Medium and Twitter if you’re interested in more BCI related posts.

If you want to support BCI.js, leave a star on the GitHub repo; pull requests are always appreciated.


BCI.js on GitHub: github.com/pwstegman/bcijs — EEG signal processing and machine learning in JavaScript

BCI.js on NPM: www.npmjs.com/package/bcijs

Other libraries:

- fili — A digital filter library for JavaScript
- js-regression — JavaScript implementation of linear regression and logistic regression

Back in 2017, I began work on WebBCI, which has since become BCI.js.


I later published a paper on WebBCI and discussed preliminary benchmarks and the effectiveness of processing EEG in JavaScript.

If you’re curious or want to use BCI.js in a published work, please check out my paper on the topic:

P. Stegman, C. Crawford, and J. Gray, “WebBCI: An Electroencephalography Toolkit Built on Modern Web Technologies,” in Augmented Cognition: Intelligent Technologies, 2018.
