
L8: MNE tutorial Part #1 - Load and Segment continuous EEG data

By Berdakh Abibullaev (EEG, BCI & Machine learning)


Full Transcript

Hello students, today I'm going to introduce you to the toolbox called MNE. This will be one of the main toolboxes we will be using for implementing our midterm or final project. MNE is a tool that can be used for EEG analysis, MEG analysis, or even electrocorticography: all types of time-series data that can be recorded from the brain, as well as other physiological signals. The first thing you need to do is type the website address, mne.tools, go to that website, and read the information about installation. MNE is a Python-based tool, so you need to have a Python distribution pre-installed.

You can read the relevant information on the website: if you go to "Installing MNE-Python", there are different ways to install depending on your operating system. For Linux there is a YAML file that contains all the dependencies and libraries you may need to install, and the steps are similar for macOS and Windows. After the installation, browse through the documentation and learn more about these tools; MNE comes with a lot of tutorials that are very useful for the analysis of EEG, MEG, and other types of data.

There are tutorials on preprocessing continuous data, segmenting the continuous data, estimating ERPs, and time-frequency analysis, with many ready-to-use examples. MNE can also be used for designing real-time BCI systems: it has a real-time module where you can acquire data in real time and decode or analyze it in real time.

If you don't want to install Python on your PC, there is another option: you can use an online cloud-computing platform. One of them is Google Colab, at colab.research.google.com. So what is Colab? Colab, short for Colaboratory, allows you to write and execute Python in your browser. You don't need any configuration, you have free access to a GPU, and you can also share Jupyter notebooks. Just explore: you can analyze, visualize, and perform signal processing on Colab, and you can also link Colab to your own Google Drive. Now let me open one of the notebooks related to MNE.

m& you can navigate to your Google drive folder where the Jupiter notebook is located that you want to open in Google collab close

this open with callab once you open the notebook in Google collab it will appear as follows and you

may want to First connect to the runtime connect in Cola let me see so yeah we have some

settings here so python runtime type GPU or TPU that Google provides for free you can also use

Then save the settings. So I'm connecting to the Google compute engine. Once you're connected, you can install MNE as simply as typing pip install mne. It will take just a few seconds, since MNE is not that big.
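
In code, this installation step is just a single Colab cell, roughly as follows:

```python
# In a Colab code cell: the leading "!" runs a shell command
!pip install mne

import mne
print(mne.__version__)  # quick check that the installation worked
```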

Second, I said that I have some files on my Google Drive that I want to open, so the next step is to connect your Google Colab notebook to your Google Drive. For that you just run the drive-mount command. Right, so now I have mounted my Google Drive in Colab.
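
For reference, the usual Colab drive-mount step looks roughly like this (/content/drive is Colab's default mount point):

```python
# Mount Google Drive so the notebook can read files stored there
from google.colab import drive
drive.mount('/content/drive')  # follow the authorization prompt
```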

Now I want to first import MNE; I will also use some matplotlib plotting functions. Second, I want to load a file, an ERP data file, and show you today how to load continuous data, segment it, and then perform some simple visualization.

The next step is to access my Google Drive. You need to find the mounted drive, go to "My Drive", and navigate to the folder where your data is located; in my case that is a Jupyter notebooks folder. Copy the path, change into that directory with a cd command, and then we can list what I have in my current directory.
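
A sketch of the navigation step using notebook magics; the folder name below is only an example, so use the path you copied from your own Drive:

```python
# Change into the Google Drive folder that holds the data
%cd /content/drive/MyDrive/jupyter_notebooks   # example path
%ls                                            # list files in the current directory
```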

These are the files, and this is the ERP file I want to load; I store its name in a variable called fname. We have already imported MNE, and among the other useful functions MNE provides are readers for different types of files, with different file extensions, recorded by different EEG and MEG devices: the mne.io module contains the input/output functions that allow you to load data. So this is the name of the file that I want to load from this directory; let's try to load it into a variable called raw. You can see we now have the raw data loaded.
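
A minimal sketch of the loading step; the file name is a placeholder, and mne.io.read_raw picks the right reader from the file extension (you can also call a format-specific reader such as mne.io.read_raw_fif directly):

```python
import mne

fname = 'erp_continuous_raw.fif'   # placeholder: use your own ERP file here
raw = mne.io.read_raw(fname)       # lazy load: the data stays on disk for now
print(raw)                         # summary of the continuous recording
```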

MNE is an object-oriented, Python-based toolbox or library, so every object has its corresponding methods. To see what you can do with this object, you can do a few things. First, type raw followed by a dot and press Tab: this shows the different methods encapsulated in the object. raw now holds continuous EEG data, and we can add channels or events, append, crop, copy, or drop channels; we can also use filter to perform band-pass filtering and many other things; plot is another widely used function, and so is save. I highly recommend you explore these functions just to learn. Another function that I like is simply calling dir(raw), which shows you what kinds of methods this object has.

Some of those are private methods related to the object, and the rest are public ones. If you want to find out what a method does, for example raw.crop, add a question mark: raw.crop? gives you some brief information. The help text says it crops the data, limiting the Raw instance to a segment between specific times, and that tmin is assumed to be zero for all subsequent calls. In the same way you can find the help information for every method of every object.
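
For example, exploring the object could look like this (the trailing question mark only works in a notebook; help() is the plain-Python equivalent):

```python
# Public methods and attributes of the Raw object
print([name for name in dir(raw) if not name.startswith('_')])

# In a notebook you can type `raw.crop?` to see the docstring;
# outside a notebook, use help() instead:
help(raw.crop)
```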

raw.filter is another very widely used function, because the first step in EEG analysis is filtering: band-pass filtering, high-pass filtering, low-pass filtering, and similar operations. Also take a look at resampling and at drop_channels for removing bad channels, just to explore on your own. Now let's try some simple filtering.

We define a band-pass filter with a low-frequency cutoff and a high-frequency cutoff, let's say between 1 and 20 Hz. We immediately run into a problem, some error. Why? Because when we load the data, MNE by default does not load it into memory; it just links to the file in the directory. So we need to set the option called preload to True and try again. Now we load the data again with preload enabled, and if we do the filtering after that, we can see it works: we are band-pass filtering the continuous data set we just loaded. Now let's plot it and see how it looks.
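
A sketch of the reload-and-filter step with the cutoffs used in the demo:

```python
# Reload with preload=True so the data is in memory (filtering requires this)
raw = mne.io.read_raw(fname, preload=True)

# Band-pass filter between 1 and 20 Hz
raw.filter(l_freq=1.0, h_freq=20.0)

# Browse the filtered continuous data
raw.plot()
```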

If you plot this continuous data set, we get the following visualization, which shows the EEG channels (Fp1, Fp2, and so on up to CP6, a subset of the channels) against time; this is just one time window that we are visualizing. EEG and MEG data are very noisy, so we can see different types of noise in the data. One common example is shown here: these big-amplitude waves are called eye-movement artifacts, basically eye-blink artifacts. When you blink, your brain data is affected, or rather the EEG sensors pick up those artifacts as well. If you analyze the data visually, you can also find some heartbeat-related activity; we don't see it here, but it would look like a regular heartbeat pattern, more on that later.

After loading your data, one of the first steps is to preprocess it and clean those kinds of artifacts. MNE provides different ready-to-use modules for cleaning data. One of the algorithms for cleaning these eye-blink, or EOG, artifacts is called ICA, which stands for independent component analysis. The MNE preprocessing module has different types of preprocessing methods which you can use. Without diving into the theoretical formulation of ICA, let's just see in practice how it works and how it can clean the data.

First, we instantiate the ICA object from the MNE library. There are some hyperparameters you need to set, such as the number of components into which you want to decompose the EEG signal. Once we have this ICA object, it works much like scikit-learn, if you are familiar with it: it comes with helper functions that allow us to fit and, in a sense, predict. We pass a band-pass-filtered copy of our raw data, filtered between 8 and 35 Hz, to the ICA fit method. The output shows some details regarding the filter, a one-pass, zero-phase, non-causal band-pass filter, and then the ICA is fitted on the 63 EEG channels we have in total. Once this process is done, we can plot the ICA components, the independent components.
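
A sketch of the ICA setup; n_components=20 and random_state are example values rather than the ones used in the video, and 8-35 Hz are the cutoffs mentioned in the demo:

```python
from mne.preprocessing import ICA

# Instantiate ICA; the number of components is a hyperparameter you choose
ica = ICA(n_components=20, random_state=42)

# Fit the ICA on a band-pass-filtered copy of the continuous data
raw_for_ica = raw.copy().filter(l_freq=8.0, h_freq=35.0)
ica.fit(raw_for_ica)

# Topographic maps of the independent components
ica.plot_components()
```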

These are the ICA components into which the signal has been decomposed. The task after performing ICA is to remove the components that look like noise. Here we need some domain knowledge, but usually you want to find activity that is localized close to the eyes and looks abnormal, not EEG-like. So let's select those components, identified by their index, and try to remove them.

The theoretical formulation of ICA will be covered later in our lectures; here I just want to show you some examples. On the ICA object that now contains the decomposition, we set ica.exclude to the component indices we selected. We can also try to find bad component indices automatically based on thresholding: if a component's relation to the reference channel exceeds two standard deviations, it is flagged as a bad index; here we use one of the frontal channels as the EOG reference channel. Then we apply the ICA and plot the result. If you compare before and after applying ICA, the original raw signal versus the cleaned one, you can see that the eye-blink-related components are gone, so we have cleaned them.
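
A sketch of the exclusion and clean-up step; the excluded indices and the 'Fp1' surrogate EOG channel are assumptions, so substitute the components and frontal channel that match your own data:

```python
# Components selected by eye from the component plots (placeholder indices)
ica.exclude = [0, 1]

# Or flag components automatically: scores against a frontal channel used as
# an EOG reference, thresholded at 2 standard deviations
eog_indices, eog_scores = ica.find_bads_eog(raw, ch_name='Fp1', threshold=2.0)
ica.exclude = sorted(set(ica.exclude) | set(eog_indices))

# Apply the ICA to a copy and compare before and after
raw_clean = raw.copy()
ica.apply(raw_clean)
raw.plot(title='Before ICA')
raw_clean.plot(title='After ICA')
```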

Cleaning eye-blink-related activity is one of the first steps when you perform EEG analysis, before segmentation. Note that we leave the raw continuous data as it is: here we demonstrated on a copy, so we did not touch the original. Now let's think about epoching, that is, segmenting this continuous data into epochs.

Usually, to epoch the continuous data we need event markers, and these can be found in the Raw object. MNE has the find_events method, mne.find_events, which finds the event annotations on the loaded object using the stim (trigger) channel. Let's see: we find the events in the Raw object we have loaded, and it tells us there are 93 events in total with the following event IDs, where 100 could correspond to the target and 200 to the non-target event. If you just open this events variable, it is a NumPy array which holds the event markers, the sample points at which each event happened.
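
In code, the event extraction looks roughly like this; the stim channel name is device-specific, so 'STI 014' below is only an example (MNE can often detect it automatically if you omit the argument):

```python
# Read the event markers from the stimulus (trigger) channel
events = mne.find_events(raw, stim_channel='STI 014')

print(events.shape)   # (n_events, 3): sample index, previous value, event ID
print(events[:5])     # first few events
```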

For the event code 200, the first event happened around sample 3,241, and we can also find the events of the second class, code 100. We can visualize the first 100 events.
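
A small sketch of that visualization:

```python
# Plot the timing of the first 100 events on the sample axis
mne.viz.plot_events(events[:100], sfreq=raw.info['sfreq'])
```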

In the plot we have the time axis in EEG samples and the locations of the time markers, or events; 100 and 200 are the event IDs. Now, can you guess which one corresponds to the target ERP and which to the non-target ERP? I said that the data we loaded is ERP data, event-related potentials, so as we can expect, the target ERP will have fewer trials than the non-target, with a ratio of about 80 to 20 percent: standard stimuli versus deviant stimuli. We use those time markers to segment our raw data into epochs.

We need to do a few steps. The mne.Epochs object, which we will be using for segmenting, requires the events array and a dictionary that maps the intended condition names to the corresponding trigger numbers. So we pass the raw continuous data, the events array, and an event_id dictionary; we can name the event IDs, for example standard stimulus and target stimulus. Let's try: we have just segmented the continuous EEG data into trials using mne.Epochs.
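
A sketch of the epoching step; the mapping of 100 to Target and 200 to Standard is an assumption based on what is said in the video, so check it against your own trigger scheme:

```python
# Condition names mapped to trigger codes (assumed mapping)
event_id = {'Target': 100, 'Standard': 200}

# Segment the continuous data around each event; preload keeps epochs in memory
epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=-0.2, tmax=0.5, preload=True)
print(epochs)

# Browse the segmented trials
epochs.plot()
```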

Now we can visualize the segmented data by plotting it. Here we go: we see the channel information, and along the top the event IDs, 200, 200, 200, 100, marking non-target and target trials, while the dashed lines separate the epochs. An epoch here is one time segment, one trial, one observation of EEG: first trial, second, third, fourth, and so on, segmented out of the continuous data. We can also observe some of the noise patterns we discussed before; this looks like an eye-blink artifact in the data.

So let's try to apply the ICA to these epochs, the segmented data, by excluding those components we defined earlier. Before applying the ICA to the epochs we need to make sure that preload is set to True; then we can apply it, and after that apply a baseline correction. Then let's explore the epochs object.
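
A sketch of applying the fitted ICA to the epochs and baseline-correcting them:

```python
# Remove the excluded components from the epoched data
# (works because the epochs were created with preload=True)
ica.apply(epochs)

# Baseline-correct each epoch using the pre-stimulus interval
epochs.apply_baseline((None, 0))
```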

Like Raw, the epochs object is a Python object with different methods and functions we can use. For example, to access the target ERP data we write epochs['Target']. If you just type epochs and run the cell, it tells us that we have Target and Standard conditions together with their trial counts. So to get the target-related data we use that command, and in the same way we can access the non-target, standard trials; we can use these condition names to get all the data for either condition.
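
For example:

```python
# The repr lists the conditions and their trial counts
print(epochs)

# Select trials by condition name
target_epochs = epochs['Target']
standard_epochs = epochs['Standard']
print(target_epochs, standard_epochs)
```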

How does the epoched ERP activity look after segmentation? We can pick one channel by its channel number; channel 13 is our Cz channel in these epochs. Let's see: epochs.info contains useful information about this object, including the channel names, 63 channels in total, so counting 1, 2, 3, 4 and so on up to 13 identifies Cz, with Fp1 being the first channel. We also have other information regarding the operations that have been performed on this data: for example, we applied a band-pass filter with cutoff frequencies of 1 and 20 Hz, and the object stores the date when the data was acquired and the sampling frequency used during acquisition. Let's try to visualize one of the channels, Cz, from the target ERP epochs.
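
One way to get the color-coded single-trial view plus the mean ERP described next is plot_image; 'Cz' assumes your montage uses that channel name:

```python
# Single-trial image and average ERP for channel Cz, target trials only
epochs['Target'].plot_image(picks='Cz')
```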

Here we can see the target ERP. The onset of the stimulus is located at zero, and the minus 0.2 on the time axis means 200 milliseconds before the stimulus. We said that the target ERP appears around 300 milliseconds after the stimulus, and here you can see it very clearly as a positive deflection of the EEG waveform. We also see some color-coded information representing the amplitude values across trials, together with the mean. You can try to visualize other channels as well.

For instance FC5: I first try channel index 8, which turns out to be FC1, so channel index 7 gives us FC5, a frontal-central electrode over the left hemisphere, and it also shows some target ERP activity. To make sure we have as many target (oddball) trials as standard trials, we can equalize the unbalanced data, remember the roughly 80-to-20 ratio: equalize_event_counts does that for us.
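
A sketch of the balancing step:

```python
# Drop trials so that both conditions end up with the same number of epochs
epochs.equalize_event_counts(['Target', 'Standard'])
print(epochs)
```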

We can do many other things using the MNE tool on our data. First, let me show you that after preprocessing you can of course save the result to a local directory. MNE lets you do this with the save method, for example epochs.save; any MNE object has a save method, which writes files up to two gigabytes in size. We need to define a file name, and the MNE community has adopted a file-extension convention in which epochs files end with -epo.fif; the .fif ending is simply the convention. You can also use Python's pickle module to save the data. Let's try: I save my epochs using the following name, and let me check whether it has appeared on my Drive. Yes, this is how it shows up, with the .fif extension following the convention; you just need to provide the name. Since the notebook is connected to my Google Drive, the file is saved to my Google Drive.
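
A sketch of the save step; the path simply points at the mounted Drive folder used in this demo and is only an example:

```python
# MNE convention: epochs files end in -epo.fif
epochs.save('/content/drive/MyDrive/jupyter_notebooks/my_erp-epo.fif',
            overwrite=True)
```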

Now, what else do we have here? Another thing I want to show you is that we can define the duration of each epoch: for example, tmin at minus 500 milliseconds before the stimulus and tmax at plus 1,500 milliseconds after the stimulus, for a total length of two seconds. In our previous segmentation we just used the default tmin and tmax provided by mne.Epochs, but you can also define a wider window length, which is sometimes useful or even required when you perform time-frequency analysis.
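
A sketch of re-epoching with the wider window:

```python
# 500 ms before to 1500 ms after the stimulus: 2 s per epoch,
# leaving room for later time-frequency analysis
epochs_long = mne.Epochs(raw, events, event_id=event_id,
                         tmin=-0.5, tmax=1.5, preload=True)
epochs_long.plot()
```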

If I do this and compare the plots, this is the result of segmenting with the longer time window. You can then apply the ICA, equalize the event counts, and save in the same way. In the next lecture I will show you how to perform time-frequency analysis, which is one of the important steps in the analysis of brain signals such as EEG and MEG.

Just to conclude this first demo: MNE has a lot of methods readily available for analyzing your EEG and MEG data. Under the hood, MNE most of the time uses the SciPy and NumPy libraries, although of course not always. Therefore, if you want to extract the data and take it outside MNE, you can do it, for example to use NumPy-based matrix operations or a Fourier transform from the SciPy module. So how can we get the data out of the epochs object?

The get_data method allows you to get the data out. Let's call the result X and look at its dimensions. X is a NumPy array, and what we have here is one dimension for the trials, the number of observations in this data set, one for the channels (63 EEG channels, plus one stimulus channel that we had to drop), and one for the time points. If you want to extract only the target-related ERPs, you have to make sure that you select the target condition before getting the data; after equalizing the counts, that is about half of the 220 epochs.
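
A sketch of pulling the data out as a NumPy array:

```python
# Extract the epoched data as a plain NumPy array
X = epochs.get_data()          # shape: (n_epochs, n_channels, n_times)
print(type(X), X.shape)

# Only the target-related trials
X_target = epochs['Target'].get_data()
print(X_target.shape)
```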

Once you take X outside MNE you can perform your favorite NumPy- or SciPy-based operations, which is very convenient, and later plug the result back into an MNE object. So if you want to do some operation outside MNE, or if you cannot find the MNE method you want to work with, you can; and I highly recommend that you explore the documentation to learn more about MNE.
