
PCA & multivariate signal processing, applied to neural data

Learn and apply cutting-edge data analysis techniques for the age of "big data" in neuroscience (theory and MATLAB code)
4.4 (178 ratings)
1,922 students enrolled
Created by Mike X Cohen
Last updated 8/2020
English
English [Auto], Polish [Auto], Romanian [Auto]
30-Day Money-Back Guarantee
This course includes
  • 10 hours on-demand video
  • 9 articles
  • 8 downloadable resources
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What you'll learn
  • Understand advanced linear algebra methods
  • Apply advanced linear algebra methods in MATLAB
  • Simulate multivariate data for testing analysis methods
  • Analyze multivariate time series datasets
  • Appreciate the challenges neuroscientists are struggling with!
  • Learn about modern neuroscience data analysis
Requirements
  • Some linear algebra background (or interest in learning!)
  • Some neuroscience background (or interest in learning!)
  • Some MATLAB programming experience (only to complete exercises)
  • Interest in learning applied linear algebra
Description

What is this course all about?

Neuroscience (brain science) is changing -- new brain-imaging technologies are producing increasingly huge datasets, and analyzing the resulting Big Data is one of the biggest struggles in modern neuroscience (if you don't believe me, ask a neuroscientist!).

The increase in the number of simultaneously recorded data channels allows new discoveries about spatiotemporal structure in the brain, but it also presents new challenges for data analysis. Because data are stored in matrices, algorithms developed in linear algebra are extremely useful.

The purpose of this course is to teach you matrix-based data analysis methods for neural time series data, with a focus on multivariate dimensionality reduction and source-separation methods. These include covariance matrices, principal components analysis (PCA), generalized eigendecomposition (even better than PCA!), and independent components analysis (ICA). The course is mathematically rigorous but approachable to individuals with no formal mathematics background. MATLAB is the primary numerical processing engine, but the material is easily portable to Python or any other language.

You should take this course if you are a...

  • neuroscience researcher who is looking for ways to analyze your multivariate data.

  • student who wants to be competitive for a neuroscience PhD or postdoc position.

  • non-neuroscientist who is interested in learning more about the big questions in modern brain science.

  • independent learner who wants to advance your linear algebra knowledge.

  • mathematician, engineer, or physicist who is curious about applied matrix decompositions in neuroscience.

  • person who wants to learn more about principal components analysis (PCA) and/or independent components analysis (ICA).

  • person intrigued by the image that starts off the Course Preview and wants to know what it means! (The answers are in this course!)


Unsure if this course is right for you?

I worked hard to make this course accessible to anyone with at least minimal linear algebra and programming background. But this course is not right for everyone. Check out the preview videos and feel free to contact me if you have any questions.

I look forward to seeing you in the course!

Who this course is for:
  • Anyone interested in next-generation neuroscience data analyses
  • Learners interested in applying linear algebra to modern big-data challenges
  • Neuroscientists dealing with "big data"
  • Mathematicians, engineers, and physicists who are interested in learning about neuroscience data
Course content
80 lectures • 10:04:34
+ Introduction
6 lectures 36:04
MATLAB code for this section
00:01

Figure out if this course is right for you, and if so, how best to learn from this course.

Preview 06:44

Learn the general goals of neuroscience research, and why neuroscience is moving towards big multivariate datasets.

Preview 10:23

Definition of spatial filters, analogy to temporal filters, and different flavors of linear spatial filters.

What are linear spatial filters?
07:52

Learn the myriad advantages of spatial filters in multivariate neuroscience.

Why spatial filters are useful for neuroscience
07:48

Solidify your theoretical knowledge using MATLAB!

Using MATLAB in this course
03:16
+ Dimensions and sources
6 lectures 41:51

Know the several interpretations of the word “dimension,” and the definition used in this course.

Preview 05:05

Understand the different interpretations of “source,” know which definition is used in this course, and appreciate the importance of source separation.

The concept of “source” in measured signals
06:46

Mechanisms of source mixing, the idea and importance of unmixing, and some key source separation terminology.

Sources, mixing, and unmixing
11:16

Reducing dimensionality and separating sources are very different. This video explains why.

Dimension reduction vs. source separation
06:24

Know the difference between linear and nonlinear filters, and why linear filters are appropriate in many areas of neuroscience.

Linear vs. nonlinear filtering
07:12

Want to know whether you can apply source separation methods to your data? Watch this video to find out!

Data requirements for source separation
05:08
+ Creating and interpreting covariance matrices
8 lectures 01:17:18

Zip file with MATLAB code and data.

MATLAB code for this section
00:01

Learn about the concepts, assumptions, and representations of correlations and covariances.

Correlation and covariance: terms and matrices
11:53

Learn the element-wise and matrix equations for covariances. Have some pointers for what to look for when viewing covariance matrices.

Preview 18:17
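The course demonstrations are in MATLAB, but the two forms of the covariance computation port directly to NumPy. A minimal sketch with simulated data (array shapes and names are illustrative, not from the course materials):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 1000))        # 4 channels, 1000 time points

# Matrix form: mean-center each channel, then C = Xc Xc' / (n - 1)
Xc = X - X.mean(axis=1, keepdims=True)
C = Xc @ Xc.T / (X.shape[1] - 1)

# Element-wise form for one channel pair (i, j)
i, j = 0, 1
cij = np.sum(Xc[i] * Xc[j]) / (X.shape[1] - 1)

print(np.allclose(C[i, j], cij))          # the two forms agree
print(np.allclose(C, C.T))                # covariance matrices are symmetric
```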

See why covariance matrices must be symmetric.

Proof: Covariance matrices are symmetric
03:51

Create covariance matrices of simulated data.

MATLAB: covariance of simulated data
15:12

Create covariance matrices of real EEG data.

MATLAB: covariance with real data
06:24

The "quadratic form" provides a way of understanding and visualizing the beautiful geometry of a covariance matrix.

The quadratic form and the covariance surface
15:25
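The covariance surface in the lecture above is just the quadratic form v'Cv evaluated over directions v. A toy NumPy sketch (the 2x2 matrix is my example, not course data) shows that the surface peaks at the largest eigenvalue:

```python
import numpy as np

C = np.array([[3.0, 1.0], [1.0, 2.0]])             # toy 2x2 covariance matrix

# Evaluate the quadratic form v' C v over unit vectors at many angles
thetas = np.linspace(0, 2 * np.pi, 400)
vs = np.stack([np.cos(thetas), np.sin(thetas)])    # 2 x 400 unit vectors
qf = np.einsum('ji,jk,ki->i', vs, C, vs)           # qf[i] = vs[:,i]' C vs[:,i]

# The surface attains its maximum along the top eigenvector of C
print(np.isclose(qf.max(), np.linalg.eigvalsh(C).max(), atol=0.01))
```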

Create and visualize the quadratic form of a covariance matrix in MATLAB.

MATLAB: visualizing the quadratic form
06:15
+ Dimension reduction with PCA
14 lectures 01:25:56
MATLAB code for this section
00:01

Have step-by-step instructions for computing a PCA on multichannel data.

How to perform a principal components analysis
08:01
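The step-by-step pipeline can be sketched in NumPy (the course itself works in MATLAB; the simulated data and variable names here are mine):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((6, 500))                 # channels x time

# 1) mean-center; 2) covariance; 3) eigendecomposition; 4) sort descending
Xc = X - X.mean(axis=1, keepdims=True)
C = Xc @ Xc.T / (Xc.shape[1] - 1)
evals, evecs = np.linalg.eigh(C)                  # ascending for symmetric C
order = np.argsort(evals)[::-1]
evals, evecs = evals[order], evecs[:, order]

# 5) project data onto the components; 6) percent variance explained
scores = evecs.T @ Xc
pct_var = 100 * evals / evals.sum()

# PC time series are uncorrelated, with variances equal to the eigenvalues
print(np.allclose(np.cov(scores), np.diag(evals)))
```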

Follow the instructions to perform a PCA!

Exercise: PCA on non-phase-locked data
03:17

Build geometric intuition of PCA using dots and surfaces.

PCA intuition with scatter plots and covariance surfaces
06:15

Learn the math (linear algebra) that underlies the solution to PCA.

Finding PC weights with eigendecomposition
07:12

See a mathematical proof that all PCs (covariance eigenvectors) are orthogonal.

Proof of principal component orthogonality
11:57

Perform PCA of simulated EEG data.

MATLAB: PCA of simulated EEG data
12:40

Perform PCA of real EEG data.

MATLAB: PCA of real EEG data

Understand what happens when you don't mean-center data before computing a covariance matrix.

MATLAB: importance of mean-centering for PCA
04:23
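The effect is easy to simulate in NumPy (a sketch with made-up data, not the course demo): without mean-centering, the top "component" points at the DC offset rather than at the direction of maximal variance.

```python
import numpy as np

rng = np.random.default_rng(2)
# 2-channel data: a large DC offset, most true variance along channel 1
X = rng.standard_normal((2, 2000)) * np.array([[3.0], [0.5]]) + 50.0

def top_pc(data):
    C = data @ data.T / (data.shape[1] - 1)   # covariance WITHOUT centering
    return np.linalg.eigh(C)[1][:, -1]        # eigenvector of largest eigenvalue

pc_raw = top_pc(X)                                        # points at the offset (~[.71, .71])
pc_centered = top_pc(X - X.mean(axis=1, keepdims=True))   # true variance axis (~[1, 0])

print(np.abs(pc_raw).round(2), np.abs(pc_centered).round(2))
```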

Understand why singular value decomposition of a data matrix will give the same results as eigendecomposition of a covariance matrix.

Dimension reduction using SVD instead of eigendecomposition
08:44
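The equivalence is a one-liner to verify in NumPy (simulated data; the course demonstrates this in MATLAB): singular values of the centered data, squared and scaled by n-1, equal the covariance eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((5, 400))
Xc = X - X.mean(axis=1, keepdims=True)

# Route 1: eigendecomposition of the covariance matrix
C = Xc @ Xc.T / (Xc.shape[1] - 1)
evals = np.sort(np.linalg.eigvalsh(C))[::-1]

# Route 2: SVD of the centered data matrix itself
s = np.linalg.svd(Xc, compute_uv=False)       # singular values, descending
evals_from_svd = s**2 / (Xc.shape[1] - 1)

print(np.allclose(evals, evals_from_svd))     # -> True
```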

Demonstration of the equivalence of SVD and eigendecomposition for PCA.

MATLAB: PCA via SVD and covariance
04:01

Learn how to use PCA to observe the state-space of a system.

PCA for state-space representation
06:26

Demonstration of 2D state-space analysis in real EEG data.

MATLAB: state-space representation via PCA
04:24

Learn why PCA is not good for source separation in multivariate neural datasets.

Preview 08:35
+ Source separation with GED
23 lectures 03:00:37

Zip file containing MATLAB code and data.

MATLAB code for this section
00:01

Conceptual idea of why and how to generalize PCA for an appropriate and general source separation algorithm.

GED as an extension of PCA
07:28

Build geometric intuition of GED using the quadratic form of covariance matrices.

GED geometric intuition with covariance surfaces
03:42

Learn the math (linear algebra) that underlies the solution to GED-based source separation.

Finding weights with generalized eigendecomposition
09:14
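In MATLAB this is eig(S,R); a NumPy-only sketch of the same generalized problem, on simulated covariances (names and data are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 6
A = rng.standard_normal((n, 3 * n))       # simulated "signal" data
B = rng.standard_normal((n, 3 * n))       # simulated "reference" data
S, R = np.cov(A), np.cov(B)

# Generalized eigendecomposition S w = lambda R w  (cf. MATLAB's eig(S,R))
evals, W = np.linalg.eig(np.linalg.inv(R) @ S)
order = np.argsort(evals.real)[::-1]
evals, W = evals.real[order], W.real[:, order]
w = W[:, 0]        # filter maximizing variance in S relative to R

# The top filter attains the largest generalized Rayleigh quotient
print(np.isclose((w @ S @ w) / (w @ R @ w), evals[0]))
```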

Tips to ensure the covariance matrices are of high quality.

Evaluating and improving covariance quality
09:59

See the theory implemented in MATLAB.

MATLAB: Single trial covariance distances
06:02

Understand the origin of eigenvector sign indeterminacy, and how to "fix" the component sign.

Component sign uncertainty
07:00
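A tiny NumPy illustration of one common sign convention (my example matrix, not course code): since v and -v are equally valid eigenvectors, flip each one so its largest-magnitude element is positive.

```python
import numpy as np

C = np.array([[2.0, 1.0], [1.0, 2.0]])
evals, evecs = np.linalg.eigh(C)

# v and -v are equally valid eigenvectors; one convention is to flip
# each eigenvector so its largest-magnitude element is positive
for k in range(evecs.shape[1]):
    idx = np.argmax(np.abs(evecs[:, k]))
    evecs[:, k] *= np.sign(evecs[idx, k])

print(evecs.round(3))
```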

Understand why the spatial filter is applied to the data, while the spatial pattern is interpreted.

Visualizing the spatial filter vs. spatial patterns
04:51
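The filter/pattern distinction in code, as a rough NumPy sketch (simulated data; the "pattern = C w" construction is the standard forward-model trick, with my names): the filter produces a component time series, while the pattern is the channel topography you plot and interpret.

```python
import numpy as np

rng = np.random.default_rng(8)
X = rng.standard_normal((4, 300))                 # channels x time
Xc = X - X.mean(axis=1, keepdims=True)
C = np.cov(X)
w = np.linalg.eigh(C)[1][:, -1]                   # a spatial filter (here: top PC)

comp = w @ Xc          # FILTER: applied to the data -> component time series
pattern = C @ w        # PATTERN: channel topography to visualize/interpret

print(comp.shape, pattern.shape)                  # -> (300,) (4,)
```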

See how GED can separate two independent sources in simulated EEG data.

MATLAB: 2 components in simulated EEG data
14:21

A discussion of how to construct the two covariance matrices.

Constructing the S and R matrices
08:14

See GED in action in task-related real EEG data!

MATLAB: Task-relevant component in EEG
13:41

Use GED to create optimal spatial filters for narrowband spectral activity.

MATLAB: Spectral scanning in MEG and EEG
10:36

PCA and GED can be combined for compression/source separation, which is good for large-scale datasets.

Two-stage compression and source separation
05:53

Apply your knowledge about two-stage compression/separation to real EEG data!

Exercise: Two-stage source separation in real EEG data
05:29

Increase spatial precision by reducing large dimensions using (partial) whitening.

Preview 11:56

See the theory implemented in MATLAB.

MATLAB: Simulated data with and without ZCA
05:05

Apply your knowledge about ZCA and two-stage compression/separation to real EEG data!

Exercise: ZCA+two-stage separation on real EEG data
03:42

Consider the consequences of covariance non-stationarities on source separation results.

Source separation with nonstationary covariances
14:44

Demonstrate the effects of covariance changes on resulting source projection topographies.

MATLAB: Simulated EEG data with alternating dipoles
09:37

Learn the math and theory of shrinkage regularization.

Regularization: Theory, math, and intuition
11:06
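Shrinkage in one line of NumPy (a sketch with simulated data; gamma and the scaled-identity target are illustrative choices): blend the covariance toward a scaled identity, which bounds the condition number.

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.standard_normal((10, 12))                  # few samples: fragile covariance
C = np.cov(X)

# Shrinkage: blend C toward a scaled identity; gamma in [0, 1]
gamma = 0.05
Creg = (1 - gamma) * C + gamma * np.mean(np.linalg.eigvalsh(C)) * np.eye(10)

# Regularization reduces the condition number, stabilizing inversion and GED
print(np.linalg.cond(C), '->', np.linalg.cond(Creg))
```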

Shrinkage regularization applied in MATLAB. 

MATLAB: Effects of regularization in real data
06:13

See a demonstration of GED and factor analysis (with default settings) on source separation.

MATLAB: GED vs. factor analysis
07:12

Secrets revealed!

Learn the secret of the course cover image!
04:31
+ Source separation for steady-state responses
5 lectures 46:44
MATLAB materials for this section
00:02

What happens in your brain when you turn on a strobe light?

The steady-state evoked potential
08:35

Learn about the key analysis motivations for a spatial filter for SSVEP.

Motivations for a spatial filter for the steady-state response
09:09

The RESS analysis pipeline in words and pictures.

RESS analysis pipeline
12:18

See RESS applied in action on real EEG data.

MATLAB: example with real EEG data
16:40
+ Independent components analysis (ICA)
5 lectures 01:03:48
MATLAB code for this section
00:01

Theoretical/conceptual overview of ICA

Overview of independent components analysis
17:27

See ICA in a simple example in a small dataset.

MATLAB: Data distributions and ICA
16:09

Independent components analysis in simulated EEG data.

MATLAB: ICA, PCA, GED on simulated data
18:32

ICs are supposed to be non-Gaussian. What do they really look like?!

MATLAB: Explore IC distributions in real data
11:39
+ Overfitting and inferential statistics
6 lectures 49:59
MATLAB code for this section
00:02

Learn what "overfitting" means and what implications it has for source separation (hint: it's not all bad!).

What is overfitting and why is it inappropriate?
10:33

Use overfitting in an unbiased way by creating and testing the spatial filters in different ways.

Unbiased filter creation and application
10:21

Avoid overfitting biases by applying the spatial filters to different data than from which they were created.

Cross-validation (in- vs. out-of-sample testing)
05:36
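The in-/out-of-sample idea in miniature, as a NumPy sketch (simulated data; the split and filter choice are illustrative): build the spatial filter on one half of the data, apply it to the other.

```python
import numpy as np

rng = np.random.default_rng(7)
X = rng.standard_normal((5, 1000))            # channels x time

# Build the spatial filter on the first half of the data only...
train, heldout = X[:, :500], X[:, 500:]
w = np.linalg.eigh(np.cov(train))[1][:, -1]   # top PC of the training covariance

# ...then apply it to the held-out half: an out-of-sample component
comp_out = w @ heldout
print(comp_out.shape)                         # -> (500,)
```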

See an example of cross-validation on real EEG data. 

MATLAB: Cross-validation in real data
11:57

Learn about inferential statistics using permutation testing to compute an effect size (z) and p-value of eigenvalues.

Permutation testing
11:30
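The permutation logic can be sketched in NumPy (simulated two-condition data, my names; the course applies this to GED eigenvalues): shuffle time points across conditions to build a null distribution of the top eigenvalue, then compute z and p.

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.standard_normal((4, 200)) * np.array([[2.5], [1], [1], [1]])  # "effect"
B = rng.standard_normal((4, 200))                                     # control
obs = np.linalg.eigvalsh(np.cov(A)).max()     # observed top eigenvalue

# Null distribution: shuffle time points across conditions, recompute
pooled = np.concatenate([A, B], axis=1)
null = np.empty(1000)
for k in range(1000):
    perm = rng.permutation(pooled.shape[1])
    null[k] = np.linalg.eigvalsh(np.cov(pooled[:, perm[:200]])).max()

z = (obs - null.mean()) / null.std()          # effect size
p = np.mean(null >= obs)                      # one-tailed p-value
print(z > 2 and p < 0.05)
```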
+ Big questions in multivariate neuroscience
6 lectures 21:27
MATLAB code for this section
00:01

Why does the relatively simple math of eigendecomposition work so well in neural data? What can we infer about brain function from this?

Math, physiology, and anatomy
04:11

Source separation methods can reveal either anatomically restricted "dipoles" or distributed networks.

Functional networks vs. volume conduction
04:00

Source separation can reveal individual differences. Can these differences be leveraged for scientific or clinical insights?

Interpreting individual differences
02:59

There are many, many parameters and options for source separation methods. Which should you use? Also: here's a reading list to learn more!

A surfeit of source separation selections (and a reading list!)
03:27

The seductive allure and dangerous pitfalls of reductionism.

Is reducing dimensionality always good?
06:49