You certainly do not want your EEG experiment to fail mid-test, so before carrying out a full study with 100 participants, start small and run a few pilot sessions to check that everything is working properly. In this blog post, we would like to shed some light on five key aspects that are crucial for EEG data processing. It is also possible to connect a variety of sensors that are not natively integrated by using the Lab Streaming Layer (LSL) protocol, which allows data from those sensors to be sent into iMotions and synchronized with other data sources. This is complemented by an open API that can connect essentially any other data stream, so virtually any data-producing device can be integrated into iMotions, opening up new research possibilities.
These steps are often overlooked, but they are crucial for obtaining reliable, high-quality results. Among others, proper re-referencing and signal-to-noise-ratio (SNR) enhancement are essential steps in EEG signal analysis, and applying them correctly requires keeping up with the latest techniques. The PSI was proposed as a phase-synchronization measure derived from the complex coherence function [227]; it quantifies the change in the phase difference between consecutive frequency bins. The weighted PSI stability, a variant of the PSI, was defined as an artifact-resistant measure for detecting cognitive EEG activity during locomotion [228].
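To give a feel for what a phase-synchronization measure computes, here is a sketch of a much simpler relative of the PSI, the phase-locking value (PLV): it extracts instantaneous phases with the Hilbert transform and measures how stable the phase difference between two channels is over time. This is an illustration of the general idea only, not the PSI definition from [227]; the signals and function name are made up for the example.

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """Phase-locking value between two equal-length signals.

    A simple phase-synchronization measure (not the PSI itself):
    1 means the phase difference is perfectly stable over time,
    values near 0 mean no consistent phase relationship.
    """
    phase_x = np.angle(hilbert(x))
    phase_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

# Two 10 Hz sinusoids with a fixed phase offset are tightly phase-locked.
fs = 250.0
t = np.arange(0, 2.0, 1.0 / fs)
x = np.sin(2 * np.pi * 10 * t)
y = np.sin(2 * np.pi * 10 * t + np.pi / 4)
plv = phase_locking_value(x, y)
print(plv > 0.95)
```

Measures like the PSI go further by looking at how the phase difference changes across frequency bins, which makes them robust against the zero-lag mixing artifacts that inflate simpler measures like the PLV.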
Commonly used methods include simple linear methods (LDA for classification and multiple linear regression), kernel methods such as SVMs, random forests, neural networks (see Section 4 for deep-learning methods), or more sophisticated combinations of methods. The scalp power-spectra map has been shown to be systematically altered in spontaneous EEG when a non-neutral reference signal is mixed into the other channels, and it depends heavily on the chosen reference scheme [344]. The choice of reference can also significantly affect measures of correlation in both the time and frequency domains [345] by introducing zero-lag correlations, making EEG functional brain-network characteristics harder to interpret. In general, approximations of the infinity reference (such as REST) have been shown to outperform other referencing schemes in estimating functional connectivity, especially when using coherence [346]. In brain connectivity studies, surrogate data must be consistent with the null hypothesis of no neural interaction while sharing all other properties of the original data.
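One common way to build such surrogates is Fourier phase randomization: keep each channel's amplitude spectrum (and hence its power spectrum and autocorrelation) but scramble the phases, destroying any genuine interaction between channels. The sketch below is a generic illustration of this technique, not the specific surrogate procedure used in [346].

```python
import numpy as np

def phase_randomized_surrogate(x, rng):
    """Fourier phase-randomized surrogate of a 1-D signal.

    Preserves the amplitude spectrum (and therefore the power spectrum
    and autocorrelation) while replacing the phases with random ones,
    consistent with the null hypothesis of no neural interaction.
    """
    spectrum = np.fft.rfft(x)
    random_phases = rng.uniform(0.0, 2.0 * np.pi, size=spectrum.shape)
    # Keep the DC (and, for even lengths, Nyquist) phase at zero so the
    # surrogate remains strictly real-valued.
    random_phases[0] = 0.0
    if x.size % 2 == 0:
        random_phases[-1] = 0.0
    return np.fft.irfft(np.abs(spectrum) * np.exp(1j * random_phases), n=x.size)

rng = np.random.default_rng(0)
x = rng.standard_normal(1024)
s = phase_randomized_surrogate(x, rng)
# The surrogate shares the amplitude spectrum of the original signal.
print(np.allclose(np.abs(np.fft.rfft(x)), np.abs(np.fft.rfft(s))))
```

A connectivity measure computed on many such surrogates yields a null distribution against which the value from the real data can be tested.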
Electroencephalography (EEG) is a valuable tool used in neuroscience research to measure brain activity. Working with EEG data requires careful attention to detail and specialized knowledge of data analysis techniques. Whether you are new to EEG data analysis or looking to improve your skills, there are several key steps to keep in mind.
It stemmed from the research of neuroscientists Frederic Gibbs, Hallowell Davis, and William Lennox on epileptiform spikes, interictal spike waves, and the three-cycles-per-second spike-and-wave pattern of clinical absence seizures. Gibbs and Herbert Jasper concluded that interictal spikes are a distinct signature of epilepsy. Brain activity in the frequency range between 4 and 7 Hz is referred to as theta activity. Theta rhythm in the EEG is often found in young adults, particularly over the temporal regions and during hyperventilation. In older individuals, theta activity with an amplitude greater than about 30 microvolts (µV) is seen less commonly, except during drowsiness.
1. Preprocessing EEG Data
In this case, contaminated data portions are replaced with data interpolated from surrounding channels or time points (in the image below, the red lines represent the corrected signal). For example, one might be interested in event-related potentials time-locked to the onset of a specific visual stimulus. If the participant blinks at that very moment, the EEG might not reflect the cortical processes of seeing the stimulus on screen. By verifying that your chosen methods return the desired outcomes, you maximize scientific research standards such as objectivity, reliability, and validity.
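The time-point variant of this repair can be sketched in a few lines: replace the contaminated samples with values interpolated between the clean samples on either side. The function name, the toy signal, and the artifact location are invented for the example; real pipelines typically use more sophisticated (e.g. spline or spherical-spline) interpolation.

```python
import numpy as np

def interpolate_segment(signal, start, stop):
    """Replace signal[start:stop] with a linear interpolation between
    the clean samples just before and just after the artifact."""
    cleaned = signal.copy()
    idx = np.arange(start, stop)
    cleaned[idx] = np.interp(idx,
                             [start - 1, stop],
                             [signal[start - 1], signal[stop]])
    return cleaned

# Toy trace: samples 2-3 are contaminated by a blink-like deflection.
sig = np.array([1.0, 1.2, 9.0, 9.5, 1.6, 1.8])
out = interpolate_segment(sig, 2, 4)
print(out)  # the 9.0/9.5 spike is replaced by values between 1.2 and 1.6
```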
For ICU patients, several measures are typically taken to reduce disturbances from the various medical instruments, devices, and lines in use. Mechanical restraints, rather than chemical ones, might be necessary at times to ensure a proper EEG recording. During a seizure, a large, hyper-synchronous neuronal discharge arises from an abnormal brain network, and EEG evaluation provides important information about the localization and spread of such discharges. A related technique is magnetoencephalography (MEG), which does not record electrical activity but instead uses sensors to capture the magnetic fields generated by the brain.
Delta waves are characterized by low-frequency (about 3 Hz), high-amplitude activity. Delta rhythms can be present during wakefulness; they are responsive to eye opening and may be enhanced by hyperventilation as well. After the test is complete, the technician will remove the electrodes from your scalp. Whatever tools you go with, it is important to decide on an analysis pipeline before looking at (or even collecting) your data, to avoid 'creating' effects by tweaking analysis parameters. There are many ways to preprocess EEG data; EEGLAB/ERPLAB is a nice starting point for beginners and for workshops or classes, because it makes it easy to visualize each step and skip back and forth in the processing stream. Other toolboxes use slightly more complicated data structures but can be much more effective for statistics and plotting (and certainly for time-frequency analyses).
The first step in working with EEG data is preprocessing, which involves cleaning and formatting the raw data for analysis. This includes removing artifacts such as eye blinks or muscle movements, filtering out noise, and re-referencing the data to a common reference point. Preprocessing is essential for ensuring the accuracy and reliability of your analysis results.
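A minimal sketch of the filtering and re-referencing steps might look like the following. The cutoff frequencies, filter order, and the choice of a common-average reference are illustrative assumptions, not recommendations from this post; appropriate values depend on your recording setup and research question.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess(eeg, fs, low=1.0, high=40.0):
    """Band-pass filter each channel, then re-reference to the common
    average.  `eeg` has shape (n_channels, n_samples)."""
    b, a = butter(4, [low, high], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, eeg, axis=1)       # zero-phase filtering
    return filtered - filtered.mean(axis=0)      # common-average reference

rng = np.random.default_rng(42)
eeg = rng.standard_normal((8, 1000))             # 8 channels, 4 s at 250 Hz
clean = preprocess(eeg, fs=250.0)
# After average re-referencing, the across-channel mean is zero everywhere.
print(np.allclose(clean.mean(axis=0), 0.0))
```

Note that artifact handling (blink removal via ICA, segment interpolation, channel rejection) would normally sit between these two steps; it is omitted here for brevity.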
2. Feature Extraction
Once the data has been preprocessed, the next step is feature extraction. This involves identifying specific patterns or characteristics in the EEG signals that are relevant to your research questions. Common features extracted from EEG data include frequency bands, event-related potentials, and spectral power densities.
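As an illustration of one such feature, spectral power in a frequency band can be estimated with Welch's method. The function name, segment length, and test signal below are assumptions for the sake of the example.

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, band):
    """Approximate spectral power of `signal` within `band` (Hz),
    estimated with Welch's method -- one common EEG feature per
    channel and frequency band."""
    freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].sum() * (freqs[1] - freqs[0])

fs = 250
t = np.arange(0, 10, 1 / fs)
alpha_signal = np.sin(2 * np.pi * 10 * t)   # a pure 10 Hz "alpha" oscillation
alpha = band_power(alpha_signal, fs, (8, 12))
delta = band_power(alpha_signal, fs, (0.5, 4))
print(alpha > 100 * delta)  # power concentrates in the alpha band
```

Repeating this per channel and per band (delta, theta, alpha, beta, gamma) yields a compact feature vector for each trial, which feeds directly into the analysis step below.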
3. Data Analysis
After feature extraction, the final step is data analysis. This involves applying statistical tests and machine learning algorithms to the extracted features to uncover patterns or relationships in the data. Common analysis techniques used with EEG data include time-frequency analysis, coherence analysis, and classification algorithms.
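To make the classification step concrete, here is a toy sketch using linear discriminant analysis (LDA) with scikit-learn. The feature matrix is synthetic (two made-up band-power features per trial, two artificial classes); it stands in for whatever features the extraction step actually produced.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Toy data: 100 trials x 2 features, classes separated along feature 1.
rng = np.random.default_rng(0)
class_a = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(50, 2))
class_b = rng.normal(loc=[3.0, 0.0], scale=1.0, size=(50, 2))
X = np.vstack([class_a, class_b])
y = np.array([0] * 50 + [1] * 50)

# 5-fold cross-validated accuracy guards against overfitting a single split.
scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5)
print(scores.mean() > 0.8)  # well-separated toy classes classify easily
```

Cross-validation matters here for the same reason as pre-registering the pipeline: evaluating on held-out trials prevents you from 'creating' effects that exist only in the training data.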
Working with EEG data can be complex and challenging, but with the right tools and techniques, researchers can gain valuable insights into brain function and cognitive processes. By following these key steps in preprocessing, feature extraction, and data analysis, researchers can unlock the full potential of EEG data in their studies.