PCA & multivariate signal processing, applied to neural data

Learn and apply cutting-edge data analysis techniques for the age of "big data" in neuroscience (theory and MATLAB code)
4.6 (163 ratings)
1,792 students enrolled
Created by Mike X Cohen
Last updated 6/2020
English [Auto-generated], Polish [Auto-generated], Romanian [Auto-generated]
This course includes
  • 10 hours on-demand video
  • 9 articles
  • 8 downloadable resources
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What you'll learn
  • Understand advanced linear algebra methods
  • Apply advanced linear algebra methods in MATLAB
  • Simulate multivariate data for testing analysis methods
  • Analyze multivariate time series datasets
  • Appreciate the challenges that neuroscientists are struggling with!
  • Learn about modern neuroscience data analysis
Requirements
  • Some linear algebra background (or interest in learning!)
  • Some neuroscience background (or interest in learning!)
  • Some MATLAB programming experience (only to complete exercises)
  • Interest in learning applied linear algebra

What is this course all about?

Neuroscience (brain science) is changing -- new brain-imaging technologies are allowing increasingly huge data sets, but analyzing the resulting Big Data is one of the biggest struggles in modern neuroscience (if you don't believe me, ask a neuroscientist!).

The increase in the number of simultaneously recorded data channels allows new discoveries about spatiotemporal structure in the brain, but also presents new challenges for data analysis. Because the data are stored in matrices, algorithms developed in linear algebra are extremely useful.

The purpose of this course is to teach you some matrix-based data analysis methods for neural time series data, with a focus on multivariate dimensionality reduction and source-separation methods. This includes covariance matrices, principal components analysis (PCA), generalized eigendecomposition (even better than PCA!), and independent components analysis (ICA). The course is mathematically rigorous but accessible to individuals with no formal mathematics background. MATLAB is the primary numerical processing engine, but the material is easily portable to Python or any other language.

You should take this course if you are a...

  • neuroscience researcher who is looking for ways to analyze your multivariate data.

  • student who wants to be competitive for a neuroscience PhD or postdoc position.

  • non-neuroscientist who is interested in learning more about the big questions in modern brain science.

  • independent learner who wants to advance your linear algebra knowledge.

  • mathematician, engineer, or physicist who is curious about applied matrix decompositions in neuroscience.

  • person who wants to learn more about principal components analysis (PCA) and/or independent components analysis (ICA).

  • person who is intrigued by the image that starts off the Course Preview and wants to know what it means! (The answers are in this course!)

Unsure if this course is right for you?

I worked hard to make this course accessible to anyone with at least minimal linear algebra and programming background. But this course is not right for everyone. Check out the preview videos and feel free to contact me if you have any questions.

I look forward to seeing you in the course!

Who this course is for:
  • Anyone interested in next-generation neuroscience data analyses
  • Learners interested in applying linear algebra to modern big-data challenges
  • Neuroscientists dealing with "big data"
  • Mathematicians, engineers, and physicists who are interested in learning about neuroscience data
Course content
80 lectures • 10:04:30 total length
+ Introduction
6 lectures 36:04
MATLAB code for this section

Figure out whether this course is right for you, and if so, how best to learn from it.

Preview 06:44

Learn the general goals of neuroscience research, and why neuroscience is moving towards big multivariate datasets.

Preview 10:23

Definition of spatial filters, analogy to temporal filters, and different flavors of linear spatial filters.

What are linear spatial filters?

Learn the myriad advantages of spatial filters in multivariate neuroscience.

Why spatial filters are useful for neuroscience

Solidify your theoretical knowledge using MATLAB!

Using MATLAB in this course
+ Dimensions and sources
6 lectures 41:51

Know the several interpretations of the word “dimension,” and the definition used in this course.

Preview 05:05

Understand the different interpretations of “source,” know which definition is used in this course, and appreciate the importance of source separation.

The concept of “source” in measured signals

Mechanisms of source mixing, the idea and importance of unmixing, and some key source separation terminology.

Sources, mixing, and unmixing

Reducing dimensionality and separating sources are very different. This video explains why.

Dimension reduction vs. source separation

Know the difference between linear and nonlinear filters, and why linear filters are appropriate in many areas of neuroscience.

Linear vs. nonlinear filtering

Want to know whether you can apply source separation methods to your data? Watch this video to find out!

Data requirements for source separation
+ Creating and interpreting covariance matrices
8 lectures 01:17:18

Zip file with MATLAB code and data.

MATLAB code for this section

Learn about the concepts, assumptions, and representations of correlations and covariances.

Correlation and covariance: terms and matrices

Learn the element-wise and matrix equations for covariances. Have some pointers for what to look for when viewing covariance matrices.

Preview 18:17

See why covariance matrices must be symmetric.

Proof: Covariance matrices are symmetric
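As a pocket version of this lecture's argument (a sketch; the lecture's notation may differ): with a mean-centered channels-by-time data matrix, the covariance and its transpose are

```latex
\mathbf{C} = \frac{1}{n-1}\,\mathbf{X}\mathbf{X}^{\mathsf{T}},
\qquad
\mathbf{C}^{\mathsf{T}}
  = \frac{1}{n-1}\left(\mathbf{X}\mathbf{X}^{\mathsf{T}}\right)^{\mathsf{T}}
  = \frac{1}{n-1}\,\mathbf{X}\mathbf{X}^{\mathsf{T}}
  = \mathbf{C},
```

using the rules $(\mathbf{A}\mathbf{B})^{\mathsf{T}} = \mathbf{B}^{\mathsf{T}}\mathbf{A}^{\mathsf{T}}$ and $(\mathbf{A}^{\mathsf{T}})^{\mathsf{T}} = \mathbf{A}$; hence every covariance matrix equals its own transpose.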

Create covariance matrices of simulated data.

MATLAB: covariance of simulated data

Create covariance matrices of real EEG data.

MATLAB: covariance with real data

The "quadratic form" provides a way of understanding and visualizing the beautiful geometry of a covariance matrix.

The quadratic form and the covariance surface

Create and visualize the quadratic form of a covariance matrix in MATLAB.

MATLAB: visualizing the quadratic form
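The course implements this in MATLAB, but since the material ports easily to Python, here is an illustrative NumPy sketch of the same idea (the covariance values are hypothetical): evaluate the quadratic form w'Cw on unit vectors around the circle, whose maximum and minimum are the largest and smallest eigenvalues of C.

```python
import numpy as np

# A 2x2 covariance matrix (hypothetical values for illustration)
C = np.array([[1.0, 0.8],
              [0.8, 1.5]])

# Unit vectors around the circle, one per degree, as columns of W
theta = np.linspace(0, 2 * np.pi, 360, endpoint=False)
W = np.stack([np.cos(theta), np.sin(theta)])

# Quadratic form w' C w for each unit vector w (one value per angle)
qf = np.sum(W * (C @ W), axis=0)

# The extrema of qf over unit vectors are the eigenvalues of C,
# attained in the eigenvector directions
evals = np.linalg.eigvalsh(C)
```

Plotting `qf` against `theta` (or the 2D surface w'Cw over all of w-space) gives the "covariance surface" from the lecture.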
+ Dimension reduction with PCA
14 lectures 01:25:56
MATLAB code for this section

Have step-by-step instructions for computing a PCA on multichannel data.

How to perform a principal components analysis
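The step-by-step recipe from this lecture can be sketched compactly. The course works in MATLAB; this is an illustrative NumPy translation with hypothetical random data standing in for a real multichannel recording.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical multichannel data: 4 channels x 1000 time points
data = rng.standard_normal((4, 1000))

# 1) Mean-center each channel
data = data - data.mean(axis=1, keepdims=True)

# 2) Compute the channels-by-channels covariance matrix
covmat = data @ data.T / (data.shape[1] - 1)

# 3) Eigendecomposition (eigh, because the covariance is symmetric)
evals, evecs = np.linalg.eigh(covmat)

# 4) Sort components by descending eigenvalue (variance explained)
idx = np.argsort(evals)[::-1]
evals, evecs = evals[idx], evecs[:, idx]

# 5) Project the data onto the eigenvectors to get component time series
comp_ts = evecs.T @ data

# Percent variance explained by each principal component
pct_var = 100 * evals / evals.sum()
```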

Follow the instructions to perform a PCA!

Exercise: PCA on non-phase-locked data

Build geometric intuition of PCA using dots and surfaces.

PCA intuition with scatter plots and covariance surfaces

Learn the math (linear algebra) that underlies the solution to PCA.

Finding PC weights with eigendecomposition

See a mathematical proof that all PCs (covariance eigenvectors) are orthogonal.

Proof of principal component orthogonality

Perform PCA of simulated EEG data.

MATLAB: PCA of simulated EEG data

Perform PCA of real EEG data.

MATLAB: PCA of real EEG data

Understand what happens when you don't mean-center data before computing a covariance matrix.

MATLAB: importance of mean-centering for PCA
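To see why mean-centering matters, consider this small NumPy sketch (an illustration with synthetic data, not the course's MATLAB demo): with a large common offset left in the data, the largest "eigenvector" points at the mean rather than at the direction of maximal variance.

```python
import numpy as np

rng = np.random.default_rng(1)
# Two channels with a large common offset plus small zero-mean fluctuations
data = rng.standard_normal((2, 5000)) + np.array([[50.0], [50.0]])

# Covariance WITHOUT mean-centering: dominated by the offset
C_raw = data @ data.T / (data.shape[1] - 1)

# Covariance WITH mean-centering: reflects the actual fluctuations
datac = data - data.mean(axis=1, keepdims=True)
C_cent = datac @ datac.T / (datac.shape[1] - 1)

# Top eigenvector of the uncentered matrix points toward the mean [50, 50]
top_raw = np.linalg.eigh(C_raw)[1][:, -1]
```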

Understand why singular value decomposition of a data matrix will give the same results as eigendecomposition of a covariance matrix.

Dimension reduction using SVD instead of eigendecomposition

Demonstration of the equivalence of SVD and eigendecomposition for PCA.

MATLAB: PCA via SVD and covariance
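The equivalence demonstrated here can be sketched in a few lines of NumPy (an illustrative translation; the course does this in MATLAB): the squared singular values of the mean-centered data matrix, divided by n-1, equal the eigenvalues of its covariance matrix.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((6, 500))
X = X - X.mean(axis=1, keepdims=True)   # mean-center each channel

# Route 1: eigendecomposition of the covariance matrix
C = X @ X.T / (X.shape[1] - 1)
evals = np.linalg.eigvalsh(C)[::-1]           # sort descending

# Route 2: singular value decomposition of the data matrix itself
svals = np.linalg.svd(X, compute_uv=False)    # already descending
evals_from_svd = svals**2 / (X.shape[1] - 1)  # singular values^2 -> eigenvalues
```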

Learn how to use PCA to observe the state-space of a system.

PCA for state-space representation

Demonstration of 2D state-space analysis in real EEG data.

MATLAB: state-space representation via PCA

Learn why PCA is not good for source separation in multivariate neural datasets.

Preview 08:35
+ Source separation with GED
23 lectures 03:00:37

Zip file containing MATLAB code and data.

MATLAB code for this section

Conceptual idea of why and how to generalize PCA for an appropriate and general source separation algorithm.

GED as an extension of PCA

Build geometric intuition of GED using the quadratic form of covariance matrices.

GED geometric intuition with covariance surfaces

Learn the math (linear algebra) that underlies the solution to GED-based source separation.

Finding weights with generalized eigendecomposition
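Numerically, the generalized eigendecomposition solves S w = λ R w for two covariance matrices S and R. Here is an illustrative Python sketch using `scipy.linalg.eigh` (the course implements this in MATLAB; the S and R matrices below are hypothetical random covariances, not the task-derived matrices discussed later in this section).

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(3)
# Hypothetical "signal" (S) and "reference" (R) covariance matrices,
# built from random data so both are symmetric positive definite
A = rng.standard_normal((8, 200))
S = A @ A.T / 199
B = rng.standard_normal((8, 200))
R = B @ B.T / 199

# Generalized eigendecomposition: solves S w = lambda R w
evals, evecs = eigh(S, R)

# The eigenvector with the largest eigenvalue is the spatial filter
w = evecs[:, np.argmax(evals)]

# That filter maximizes the Rayleigh quotient (w'Sw)/(w'Rw)
rayleigh = (w @ S @ w) / (w @ R @ w)
```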

Tips to ensure the covariance matrices are of high quality.

Evaluating and improving covariance quality

See the theory implemented in MATLAB.

MATLAB: Single trial covariance distances

Understand the origin of eigenvector sign indeterminacy, and how to "fix" the component sign.

Component sign uncertainty
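Because eigenvectors are defined only up to sign (if w solves the equation, so does -w), some convention is needed to make component maps comparable. One common fix, sketched here in NumPy as an illustration (the lecture may use a different convention), is to flip each eigenvector so its largest-magnitude element is positive.

```python
import numpy as np

def fix_sign(evecs):
    """Flip each eigenvector (column) so that its largest-magnitude
    element is positive. Eigenvectors are sign-indeterminate, so a
    convention like this makes components comparable across runs."""
    max_idx = np.argmax(np.abs(evecs), axis=0)
    signs = np.sign(evecs[max_idx, np.arange(evecs.shape[1])])
    return evecs * signs

# Demo with two hypothetical eigenvectors as columns
v = np.array([[ 0.2, -0.9],
              [-0.8,  0.3]])
vf = fix_sign(v)
```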

Understand why the spatial filter is applied to the data, while the spatial pattern is interpreted.

Visualizing the spatial filter vs. spatial patterns

See how GED can separate two independent sources in simulated EEG data.

MATLAB: 2 components in simulated EEG data

A discussion of how to construct the two covariance matrices.

Constructing the S and R matrices

See GED in action in task-related real EEG data!

MATLAB: Task-relevant component in EEG

Use GED to create optimal spatial filters for narrowband spectral activity.

MATLAB: Spectral scanning in MEG and EEG

PCA and GED can be combined for compression/source separation, which is good for large-scale datasets.

Two-stage compression and source separation

Apply your knowledge about two-stage compression/separation to real EEG data!

Exercise: Two-stage source separation in real EEG data

Increase spatial precision by reducing large dimensions using (partial) whitening.

Preview 11:56
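The lecture covers partial whitening; the fully whitened (ZCA) case can be sketched in NumPy as follows (an illustration with synthetic data, not the course's MATLAB code). ZCA builds a whitening matrix from the eigendecomposition of the covariance, so the transformed data have an identity covariance; partial whitening applies the same idea to only the larger dimensions.

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.standard_normal((5, 2000))
X = X - X.mean(axis=1, keepdims=True)
C = X @ X.T / (X.shape[1] - 1)

# ZCA whitening matrix: W = V D^{-1/2} V', with V, D from eig(C)
d, V = np.linalg.eigh(C)
W = V @ np.diag(d ** -0.5) @ V.T

Xw = W @ X                            # whitened data
Cw = Xw @ Xw.T / (Xw.shape[1] - 1)    # covariance of whitened data ~ identity
```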

See the theory implemented in MATLAB.

MATLAB: Simulated data with and without ZCA

Apply your knowledge about ZCA and two-stage compression/separation to real EEG data!

Exercise: ZCA+two-stage separation on real EEG data

Consider the consequences of covariance non-stationarities on source separation results.

Source separation with nonstationary covariances

Demonstrate the effects of covariance changes on resulting source projection topographies.

MATLAB: Simulated EEG data with alternating dipoles

Learn the math and theory of shrinkage regularization.

Regularization: Theory, math, and intuition
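The shrinkage idea can be sketched in a few lines of NumPy (an illustrative version, assuming the common form that shrinks toward a scaled identity; the lecture's exact formulation may differ): blend the covariance matrix with the identity scaled by the mean eigenvalue, which restores full rank when there are more channels than data.

```python
import numpy as np

def shrink(R, gamma):
    """Shrinkage regularization:
    R_reg = (1 - gamma)*R + gamma*mean(eigenvalues)*I.
    Pulls R toward a scaled identity, improving its conditioning.
    (mean of the eigenvalues = trace(R)/n)"""
    n = R.shape[0]
    return (1 - gamma) * R + gamma * (np.trace(R) / n) * np.eye(n)

rng = np.random.default_rng(5)
# Rank-deficient covariance (more channels than samples): not invertible
A = rng.standard_normal((10, 5))
R = A @ A.T / 4

R_reg = shrink(R, 0.01)   # even 1% shrinkage makes it positive definite
```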

Shrinkage regularization applied in MATLAB. 

MATLAB: Effects of regularization in real data

See a demonstration of GED and factor analysis (with default settings) applied to source separation.

MATLAB: GED vs. factor analysis

Secrets revealed!

Learn the secret of the course cover image!
+ Source separation for steady-state responses
5 lectures 46:44
MATLAB materials for this section

What happens in your brain when you turn on a strobe light?

The steady-state evoked potential

Learn about the key analysis motivations for a spatial filter for SSVEP.

Motivations for a spatial filter for the steady-state response

The RESS analysis pipeline in words and pictures.

RESS analysis pipeline

See RESS applied in action on real EEG data.

MATLAB: example with real EEG data
+ Independent components analysis (ICA)
5 lectures 01:03:48
MATLAB code for this section

Theoretical/conceptual overview of ICA

Overview of independent components analysis

See ICA in a simple example in a small dataset.

MATLAB: Data distributions and ICA

Independent components analysis in simulated EEG data.

MATLAB: ICA, PCA, GED on simulated data

ICs are supposed to be non-Gaussian. What do they really look like?!

MATLAB: Explore IC distributions in real data
+ Overfitting and inferential statistics
6 lectures 49:59
MATLAB code for this section

Learn what "overfitting" means and what implications it has for source separation (hint: it's not all bad!).

What is overfitting and why is it inappropriate?

Use overfitting in an unbiased way by creating and testing the spatial filters in different ways.

Unbiased filter creation and application

Avoid overfitting biases by applying the spatial filters to different data than from which they were created.

Cross-validation (in- vs. out-of-sample testing)

See an example of cross-validation on real EEG data. 

MATLAB: Cross-validation in real data

Learn about inferential statistics using permutation testing to compute an effect size (z) and p-value of eigenvalues.

Permutation testing
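The permutation-testing logic can be sketched generically in NumPy (an illustration only: here the test statistic is a simple variance difference standing in for a GED eigenvalue, and the data are synthetic). Shuffle the condition labels many times, recompute the statistic under each shuffle to build a null distribution, then express the observed value as a z-score and p-value against that null.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical data from two conditions; the statistic is the difference
# in variance (a stand-in for an eigenvalue from a real GED analysis)
cond_a = rng.standard_normal(50) * 2.0   # higher-variance condition
cond_b = rng.standard_normal(50)
observed = cond_a.var() - cond_b.var()

pooled = np.concatenate([cond_a, cond_b])
n_perm = 1000
null = np.empty(n_perm)
for i in range(n_perm):
    rng.shuffle(pooled)                       # shuffle condition labels
    null[i] = pooled[:50].var() - pooled[50:].var()

# Effect size (z) relative to the permutation null, and a p-value
z = (observed - null.mean()) / null.std()
p = (np.sum(null >= observed) + 1) / (n_perm + 1)
```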
+ Big questions in multivariate neuroscience
6 lectures 21:27
MATLAB code for this section

Why does the relatively simple math of eigendecomposition work so well in neural data? What can we infer about brain function from this?

Math, physiology, and anatomy

Source separation methods can reveal either anatomically restricted "dipoles" or distributed networks.

Functional networks vs. volume conduction

Source separation can reveal individual differences. Can these differences be leveraged for scientific or clinical insights?

Interpreting individual differences

There are many, many parameters and options for source separation methods. Which should you use? Also: here's a reading list to learn more!

A surfeit of source separation selections (and a reading list!)

The seductive allure and dangerous pitfalls of reductionism.

Is reducing dimensionality always good?