EEG / Evoked Potential Processing

Abstract

The EP (Evoked Potential) Lab at the Technion measures and analyzes EEG signals.
The data is gathered from 21 different points along the scalp into a large data matrix.
The EP lab needed a tool for analyzing this data in Matlab, and our project provided that tool. We wrote a Matlab GUI whose input is the data matrix, and which can graphically present the data analyzed by two different methods: PCA (Principal Component Analysis) and CNN (Competitive Neural Network). Both methods are used to reduce data dimensionality, the first in a statistical way, and the second with a completely different approach – a learning machine.

 

The Statistical approach: PCA:

Principal component analysis (PCA) is a classical statistical method. It is primarily a linear transformation of the data matrix

    X = [x_1, x_2, …, x_n],   x_i ∈ R^d

to a different orthogonal basis representation, where the new dimensions are constructed so that the first explains as much of the data variance as it can, the second dimension a bit less, and so on. The presumption is that the last dimensions, which explain little variance, are irrelevant and are, basically, noise.

In order to make the transformation we first take the covariance matrix

    C = (1/n) Σ_{i=1..n} (x_i − μ)(x_i − μ)^T,   μ = (1/n) Σ_{i=1..n} x_i

where μ is the sample mean. From a symmetric matrix such as the covariance matrix, we can calculate an orthogonal basis by finding its eigenvalues and eigenvectors. The eigenvectors e_i and the corresponding eigenvalues λ_i are the solutions of the equation

    C e_i = λ_i e_i,   i = 1, …, d
By ordering the eigenvectors in the order of descending eigenvalues (largest first), one can create an ordered orthogonal basis with the first eigenvector having the direction of largest variance of the data. In this way, we can find directions in which the data set has the most significant amounts of energy.

By normalizing the eigenvalues, we can tell what percentage of the total energy each eigenvector explains. By cutting every vector that explains less than 5% (for example), and reconstructing the data using only the few remaining eigenvectors, we can filter out the noise and remain with the phenomena under test.
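The procedure above can be sketched in a few lines. The project itself was written in Matlab; the following is only an illustrative NumPy sketch (the function name and the 5% threshold parameter are ours, not the project's actual code):

```python
import numpy as np

def pca_filter(X, energy_thresh=0.05):
    """Keep only principal components that each explain more than
    `energy_thresh` of the total energy, and reconstruct the data.
    X: (n_samples, n_channels) data matrix."""
    mu = X.mean(axis=0)
    Xc = X - mu                          # center the data
    C = Xc.T @ Xc / X.shape[0]          # covariance matrix
    lam, E = np.linalg.eigh(C)          # eigh: for symmetric matrices
    order = np.argsort(lam)[::-1]       # sort eigenvalues, largest first
    lam, E = lam[order], E[:, order]
    frac = lam / lam.sum()              # energy fraction per component
    Ek = E[:, frac > energy_thresh]     # drop low-energy components
    return (Xc @ Ek) @ Ek.T + mu, frac  # reconstruction, energy fractions
```

Projecting onto the retained eigenvectors and back (the `(Xc @ Ek) @ Ek.T` step) is exactly the noise-filtering reconstruction described above.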

 

The Learning Machine approach: CNN:
Competitive Neural Network (CNN) was originally designed as a spontaneous clustering machine. The basic approach is to take several artificial neurons and initialize their weight vectors to the mean of the data.
Each step of the learning algorithm is comprised of the following:
a. choose a random data point
b. find the closest neuron (in terms of Euclidean distance from its weight vector)
c. update this weight vector one learning step of size µ towards the data point
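The steps above can be sketched as follows. Again this is an illustrative NumPy sketch, not the project's Matlab code; the small random jitter added at initialization is our own assumption (without it, identical weight vectors would make one neuron win every point):

```python
import numpy as np

def competitive_learning(X, n_neurons=4, steps=2000, mu=0.05, seed=0):
    """Competitive learning: weight vectors start at the data mean and are
    pulled, one random sample at a time, towards the winning data points.
    X: (n_samples, n_dims). Returns an (n_neurons, n_dims) weight matrix."""
    rng = np.random.default_rng(seed)
    W = np.tile(X.mean(axis=0), (n_neurons, 1))      # init at the data mean
    W += 1e-3 * rng.normal(size=W.shape)             # tiny jitter to break ties
    for _ in range(steps):
        x = X[rng.integers(len(X))]                  # a. random data point
        j = np.argmin(np.linalg.norm(W - x, axis=1)) # b. closest neuron
        W[j] += mu * (x - W[j])                      # c. step of size mu
    return W
```

After enough steps the weight vectors settle near the cluster centers of the data, which is the dimensionality reduction the method provides.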

Schematic description: [figure]

 

The developed tool: GUI in Matlab 6.5:
The project consisted of three parts: realization of each of the two algorithms above in Matlab, and writing the GUI (Graphical User Interface).
The GUI is able to analyze data in several ways, and to export the analysis to the Matlab workspace or as comma-separated values (a detailed description can be found in the project book).

A sneak peek: [screenshot]

 

Conclusions

The main goal of this project was to give the researchers at the EP lab a reliable tool for data analysis. That was accomplished.

However, looking at both methods, it is our belief that PCA is the tool of choice for this kind of data:

1. PCA is a much faster algorithm than CNN, and for the large quantity of data produced in a single EEG session this is a decisive factor.
2. The principal components that arise from the process are more meaningful than those produced by CNN.
3. One can easily control the amount of data and energy lost in the dimensionality reduction with PCA; not so with CNN, where the number of neurons used is critical and unknown in advance.

 

Acknowledgment

We are grateful to our project supervisor, Dr. Danny Lange, for his help and guidance throughout this work.
We are also very grateful to Alon & Ilan from the Evoked Potentials Lab, Gutwirth Park, Technion, for their enlightening explanations and comments.