BrainCheck is a neural decoding application intended for medical research with patients who have locked-in syndrome: it allows questionnaires to be administered effectively to non-verbal, paralyzed patients.

BrainCheck uses a subset of the MindBigData dataset, which consists of 1,207,293 two-second EEG signals captured from 4 commercial devices while subjects viewed a handwritten digit (0 to 9). Data from the 5-channel Emotiv Insight was normalized with min-max scaling, followed by forward and backward filling of missing values. As a final preprocessing step, powerline artifacts were removed and a Morlet wavelet transform was applied for feature extraction.

The model is built with TensorFlow and consists of 2 convolutional layers (depthwise Conv2D) with 8 and 16 filters respectively, each with kernel size (1, 2). These are followed by a max-pooling layer (MaxPool2D) of size (1, 2), whose output is flattened and passed to 3 fully connected layers with 128, 64, and 32 neurons. The output layer consists of 10 neurons, one per digit (0-9). ReLU is the activation function for all layers except the output layer, which uses softmax. The model is trained with categorical cross-entropy loss and the Adam optimizer, with RMSE tracked as a metric.

The model and its saved weights are then loaded into a Python script that uses socket.io to negotiate the interaction between the model and the client. For our demo, raw data from our dataset is passed into the model, and the prediction along with its confidence values is sent to the client over a WebSocket. As the user iterates through the questions, the responses are saved and can be exported to JSON at the end.
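The scaling and filling steps above can be sketched roughly as follows. This is a minimal illustration using pandas, not the project's actual pipeline: the channel names are hypothetical, and the powerline removal (e.g. a 50/60 Hz notch filter) and the Morlet wavelet feature extraction are omitted.

```python
import numpy as np
import pandas as pd

def preprocess(eeg: pd.DataFrame) -> pd.DataFrame:
    """Min-max scale each EEG channel to [0, 1], then fill remaining
    gaps (NaNs from dropped samples) forward and then backward.

    eeg: DataFrame of shape (samples, channels) — here assumed to hold
    the 5 Emotiv Insight channels; column names are illustrative.
    """
    # Per-channel min-max scaling to [0, 1]; pandas skips NaNs when
    # computing the per-column min and max.
    scaled = (eeg - eeg.min()) / (eeg.max() - eeg.min())
    # Forward fill, then backward fill, so no NaN survives even at
    # the start of a recording.
    return scaled.ffill().bfill()
```

A subsequent notch filter and Morlet transform would then run on the scaled, gap-free channels before the data reaches the model.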
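The architecture described above might be assembled in Keras along these lines. This is a sketch under stated assumptions, not the project's actual code: the input shape (5 channels × 256 timesteps, roughly 2 s at the Insight's 128 Hz sampling rate) is assumed, and `SeparableConv2D` (a depthwise convolution followed by a pointwise one) is used as one way to realize depthwise convolutional layers with a controllable filter count.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_model(channels: int = 5, timesteps: int = 256) -> tf.keras.Model:
    model = models.Sequential([
        layers.Input(shape=(channels, timesteps, 1)),
        # Two depthwise-style conv layers with 8 and 16 filters,
        # kernel size (1, 2): they slide along the time axis only.
        layers.SeparableConv2D(8, (1, 2), activation="relu"),
        layers.SeparableConv2D(16, (1, 2), activation="relu"),
        layers.MaxPool2D(pool_size=(1, 2)),
        layers.Flatten(),
        # Three fully connected layers: 128, 64, 32 neurons.
        layers.Dense(128, activation="relu"),
        layers.Dense(64, activation="relu"),
        layers.Dense(32, activation="relu"),
        # One output neuron per digit 0-9, softmax for confidences.
        layers.Dense(10, activation="softmax"),
    ])
    model.compile(
        loss="categorical_crossentropy",
        optimizer="adam",
        metrics=[tf.keras.metrics.RootMeanSquaredError()],
    )
    return model
```

The softmax output doubles as the per-digit confidence vector that the demo later streams to the client.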
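The message sent to the client for each EEG window, and the end-of-session JSON export, might look like the following. The field names here are illustrative assumptions, not the project's actual schema, and the socket.io transport itself is not shown.

```python
import json

def make_payload(confidences):
    """Build the message for one EEG window from the model's softmax
    output (10 probabilities, one per digit). Field names are
    illustrative, not the project's actual schema.
    """
    prediction = max(range(10), key=lambda d: confidences[d])
    return {
        "prediction": prediction,
        "confidences": [round(c, 4) for c in confidences],
    }

def export_session(answers, path):
    """Save the accumulated question/answer records to a JSON file
    at the end of the questionnaire.
    """
    with open(path, "w") as f:
        json.dump(answers, f, indent=2)
```

A server would emit `make_payload(...)` over the WebSocket after each inference, append the user's resolved answer to a session list, and call `export_session` when the questionnaire ends.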
