From 0 to 1: Machine Learning, NLP & Python-Cut to the Chase

A down-to-earth, shy but confident take on machine learning techniques that you can put to work today
4.1 (454 ratings)
4,024 students enrolled
Created by Loony Corn
Last updated 9/2016
English
Includes:
  • 20.5 hours on-demand video
  • 111 Supplemental Resources
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What Will I Learn?
Identify situations that call for the use of Machine Learning
Understand which type of Machine learning problem you are solving and choose the appropriate solution
Use Machine Learning and Natural Language Processing to solve problems like text classification and text summarization in Python
Requirements
  • No prerequisites; knowledge of some undergraduate-level mathematics would help but is not mandatory. Working knowledge of Python would be helpful if you want to run the source code that is provided.
Description

Prerequisites: No prerequisites; knowledge of some undergraduate-level mathematics would help but is not mandatory. Working knowledge of Python would be helpful if you want to run the source code that is provided.

Taught by a Stanford-educated ex-Googler and an IIT- and IIM-educated ex-Flipkart lead analyst. This team has decades of practical experience in quant trading, analytics and e-commerce.

This course is a down-to-earth, shy but confident take on machine learning techniques that you can put to work today.

Let’s parse that.

The course is down-to-earth: it makes everything as simple as possible, but not simpler.

The course is shy but confident: it is authoritative, drawn from decades of practical experience, but shies away from needlessly complicating things.

You can put ML to work today : If Machine Learning is a car, this car will have you driving today. It won't tell you what the carburetor is.

The course is very visual: most of the techniques are explained with the help of animations to help you understand better.

This course is practical as well: there are hundreds of lines of commented source code that can be used directly to implement natural language processing and machine learning for text summarization and text classification in Python.

The course is also quirky. The examples are irreverent. Lots of little touches: repetition, zooming out so we remember the big picture, active learning with plenty of quizzes. There’s also a peppy soundtrack, and art - all shown by studies to improve cognition and recall.

What's Covered:

Machine Learning:

Supervised/Unsupervised learning, Classification, Clustering, Association Detection, Anomaly Detection, Dimensionality Reduction, Regression.

Naive Bayes, K-Nearest Neighbours, Support Vector Machines, Artificial Neural Networks, K-Means, Hierarchical clustering, Principal Components Analysis, Linear regression, Logistic regression, Random variables, Bayes' theorem, Bias-variance tradeoff

Natural Language Processing with Python:

Corpora, stopwords, sentence and word parsing, auto-summarization, sentiment analysis (as a special case of classification), TF-IDF, Document Distance, Text summarization, Text classification with Naive Bayes and K-Nearest Neighbours and Clustering with K-Means

Sentiment Analysis: 

Why it's useful, Approaches to solving - Rule-Based, ML-Based, Training, Feature Extraction, Sentiment Lexicons, Regular Expressions, Twitter API, Sentiment Analysis of Tweets with Python

Mitigating Overfitting with Ensemble Learning:

Decision trees and decision tree learning, Overfitting in decision trees, Techniques to mitigate overfitting (cross validation, regularization), Ensemble learning and Random forests
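The ensemble idea above can be sketched in a few lines (again, our illustrative Python 3 / scikit-learn sketch with made-up toy data, not the course's source code): a random forest trains many decision trees on bootstrap samples and random feature subsets, then averages their votes, which reduces the variance of any single overfit tree.

```python
# Random forest sketch: many randomized trees voting together.
from sklearn.ensemble import RandomForestClassifier

# Two well-separated toy clusters (hypothetical data)
X = [[0, 0], [0, 1], [1, 0], [1, 1],
     [5, 5], [5, 6], [6, 5], [6, 6]]
y = [0, 0, 0, 0, 1, 1, 1, 1]

# Each tree sees a bootstrap sample of rows and a random subset of
# features at each split; averaging their votes smooths out noise.
forest = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(forest.predict([[0.5, 0.5], [5.5, 5.5]]))
```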

Recommendations:  Content based filtering, Collaborative filtering and Association Rules learning

Get started with Deep learning: Apply Multi-layer perceptrons to the MNIST Digit recognition problem

A Note on Python: The code-alongs in this class all use Python 2.7. Source code (with copious amounts of comments) is attached as a resource with all the code-alongs. The source code has been provided for both Python 2 and Python 3 wherever possible.

Mail us about anything - anything! - and we will always reply :-)

Who is the target audience?
  • Yep! Analytics professionals, modelers, big data professionals who haven't had exposure to machine learning
  • Yep! Engineers who want to understand or learn machine learning and apply it to problems they are solving
  • Yep! Product managers who want to have intelligent conversations with data scientists and engineers about machine learning
  • Yep! Tech executives and investors who are interested in big data, machine learning or natural language processing
  • Yep! MBA graduates or business professionals who are looking to move to a heavily quantitative role
Curriculum For This Course
87 Lectures 20:32:10
Introduction
1 Lecture 03:17

We - the course instructors - start with introductions. We are a team that has studied at Stanford, IIT Madras, IIM Ahmedabad and spent several years working in top tech companies, including Google and Flipkart.

Next, we talk about the target audience for this course: Analytics professionals, modelers and big data professionals certainly, but also Engineers, Product managers, Tech Executives and Investors, or anyone who has some curiosity about machine learning.

If Machine Learning is a car, this class will teach you how to drive. By the end of this class, students will be able to spot situations where machine learning can be used and deploy the appropriate solutions. Product managers and executives will learn enough of the 'how' to converse intelligently with their data science counterparts.

This course is practical as well: there are hundreds of lines of commented source code that can be used directly to implement natural language processing and machine learning for text summarization and text classification in Python.

Preview 03:17
Jump right in: Machine learning for Spam detection
4 Lectures 01:08:02

Machine learning is quite the buzzword these days. While it's been around for a long time, today its applications are wide and far-reaching - from computer science to social science, quant trading and even genetics. From the outside, it seems like a very abstract science that is heavy on the math and tough to visualize. But it is not at all rocket science. Machine learning is like any other science - if you approach it from first principles and visualize what is happening, you will find that it is not that hard. So let's get right into it: we will take an example and see what machine learning is and why it is so useful.

Preview 16:31

Machine learning usually involves a lot of terms that sound really obscure. We'll see a real-life implementation of a machine learning algorithm (Naive Bayes), and by the end of it you should be able to speak some of the language of ML with confidence.

Plunging In - Machine Learning Approaches to Spam Detection
17:01

We have gotten our feet wet and seen the implementation of one ML solution to spam detection - let's venture a little further and see some other ways to solve the same problem. We'll see how K-Nearest Neighbors and Support Vector Machines can be used to solve spam detection.

Spam Detection with Machine Learning Continued
17:04

So far we have been slowly getting comfortable with machine learning - we took one example and saw a few different approaches. That was just the tip of the iceberg - this class is an aerial maneuver: we will scout ahead and see the different classes of problems that Machine Learning can solve and that we will cover in this class.

Get the Lay of the Land: Types of Machine Learning Problems
17:26
Naive Bayes Classifier
4 Lectures 01:01:38
Many popular machine learning techniques are probabilistic in nature, and some working knowledge of probability helps. We'll cover random variables, probability distributions and the normal distribution.
Random Variables
20:10

We have been learning some fundamentals that will help us with probabilistic concepts in Machine Learning. In this class, we will learn about conditional probability and Bayes' theorem, which is the foundation of many ML techniques.
Bayes Theorem
18:36
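To make the theorem concrete, here is a tiny worked example in Python (our illustration with made-up numbers, not taken from the lecture): given how often a word like "free" appears in spam versus legitimate mail, Bayes' theorem tells us how much seeing that word should raise our suspicion.

```python
# Bayes' theorem on a toy spam example (the probabilities are invented
# for illustration):  P(spam | word) = P(word | spam) * P(spam) / P(word)

p_spam = 0.2        # prior: 20% of all mail is spam
p_word_spam = 0.5   # "free" appears in half of spam mails
p_word_ham = 0.05   # ...but in only 5% of legitimate mails

# Total probability of seeing the word at all (law of total probability)
p_word = p_word_spam * p_spam + p_word_ham * (1 - p_spam)

posterior = p_word_spam * p_spam / p_word
print(round(posterior, 3))  # seeing "free" lifts P(spam) from 0.2 to ~0.714
```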

The Naive Bayes Classifier is a probabilistic classifier. We have built the foundation to understand what goes on under the hood - let's understand how the Naive Bayes classifier uses Bayes' theorem.
Naive Bayes Classifier
08:49
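For a flavor of what a Naive Bayes spam filter looks like in code, here is a minimal Python 3 sketch with scikit-learn on a hypothetical four-message corpus (the course's own code-alongs use Python 2.7 with full source attached; this is our illustration, not that code):

```python
# Mini spam filter: bag-of-words counts fed to a Naive Bayes classifier.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

train = ["win free money now", "free prize claim now",     # spam
         "meeting agenda attached", "lunch tomorrow at noon"]  # ham
labels = ["spam", "spam", "ham", "ham"]

vec = CountVectorizer()
X = vec.fit_transform(train)          # word-count features
clf = MultinomialNB().fit(X, labels)  # learns P(word | class) with smoothing

print(clf.predict(vec.transform(["claim your free prize"])))
```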

We will see how the Naive Bayes classifier can be used with an example.
Preview 14:03
K-Nearest Neighbors
2 Lectures 27:56
Let's understand the k-Nearest Neighbors setup with a visual representation of how the algorithm works.
K-Nearest Neighbors
13:09

There are a few wrinkles in k-Nearest Neighbors. These are just the things to keep in mind if and when you decide to implement it.
K-Nearest Neighbors: A few wrinkles
14:47
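The core of k-Nearest Neighbors fits in a few lines: classify a new point by majority vote among its k closest training points. A toy Python 3 / scikit-learn sketch (ours, with invented 2-D data; one wrinkle to remember is that distances only make sense if the features are on comparable scales):

```python
# k-NN sketch: majority vote among the k nearest training points.
from sklearn.neighbors import KNeighborsClassifier

X = [[0, 0], [0, 1], [1, 0],    # cluster near the origin -> class "a"
     [5, 5], [5, 6], [6, 5]]    # cluster far away        -> class "b"
y = ["a", "a", "a", "b", "b", "b"]

knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)
print(knn.predict([[0.5, 0.5], [5.5, 5.5]]))
```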
Support Vector Machines
2 Lectures 24:39

We have been talking about different classifier algorithms. We'll learn about Support Vector Machines, which are linear classifiers.

Support Vector Machines Introduced
08:16

The Support Vector Machines algorithm can be framed as an optimization problem, and the kernel trick can be used along with SVMs to perform non-linear classification.
Support Vector Machines: Maximum Margin Hyperplane and Kernel Trick
16:23
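A quick way to see the kernel trick at work (our Python 3 / scikit-learn sketch, not the lecture's code): XOR-style data cannot be split by any straight line, so a linear SVM fails on it, while an RBF-kernel SVM implicitly maps the points into a higher-dimensional space where they become separable.

```python
# Kernel trick sketch: linear vs RBF SVM on XOR-like data.
from sklearn.svm import SVC

X = [[0, 0], [1, 1],   # class 0: one diagonal
     [0, 1], [1, 0]]   # class 1: the other diagonal
y = [0, 0, 1, 1]

linear = SVC(kernel="linear").fit(X, y)  # no line separates XOR
rbf = SVC(kernel="rbf").fit(X, y)        # non-linear boundary via kernel

print("linear training accuracy:", linear.score(X, y))
print("rbf training accuracy:   ", rbf.score(X, y))
```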
Clustering as a form of Unsupervised learning
2 Lectures 32:49
Clustering helps us understand the patterns in a large set of data that we don't know much about. It is a form of unsupervised learning.
Preview 19:07

K-Means and DBSCAN are two very popular clustering algorithms. How do they work, and what are the key considerations?
Clustering: K-Means and DBSCAN
13:42
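The key contrast between the two algorithms can be shown on toy data (our Python 3 / scikit-learn sketch with invented points): K-Means needs the number of clusters up front, while DBSCAN grows clusters from dense regions and marks sparse points as noise (label -1).

```python
# Clustering sketch: K-Means (k chosen up front) vs DBSCAN (density-based).
from sklearn.cluster import KMeans, DBSCAN

X = [[0.0, 0.0], [0.2, 0.1], [0.1, 0.3],   # dense blob near the origin
     [5.0, 5.0], [5.1, 5.2], [4.9, 5.1]]   # dense blob far away

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
db = DBSCAN(eps=0.5, min_samples=2).fit(X)  # clusters = dense neighborhoods

print(km.labels_)  # two groups of three points each
print(db.labels_)  # same structure found without specifying k
```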
Association Detection
1 Lecture 09:12
Association rules learning is all about finding relationships in the data - sometimes relationships that you would not intuitively expect to find. It is pretty powerful - so let's take a peek at what it does.
Association Rules Learning
09:12
Dimensionality Reduction
2 Lectures 29:15

The data that you are working with can be noisy, garbled or difficult to make sense of. It can be so complicated that it's difficult to process efficiently. Dimensionality reduction to the rescue - it cleans up the noise and shows you a clear picture. Getting rid of unnecessary features also makes the computation simpler.

Dimensionality Reduction
10:22

PCA is one of the most famous Dimensionality Reduction techniques. When you have data with a lot of variables and confusing interactions, PCA clears the air and finds the underlying causes.
Principal Component Analysis
18:53
Artificial Neural Networks
1 Lecture 11:18

Artificial Neural Networks are much misunderstood because of the name. We will see the Perceptron (a prototypical example of ANNs) and how it is analogous to a Support Vector Machine.

Artificial Neural Networks: Perceptrons Introduced
11:18
Regression as a form of supervised learning
2 Lectures 24:07
Regression can be used to predict the value of a variable given some predictor variables. We'll see an example to understand its use and cover two popular methods: Linear and Logistic regression.
Regression Introduced : Linear and Logistic Regression
13:54
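The two methods side by side, as a minimal Python 3 / scikit-learn sketch with made-up data (our illustration, not the lecture's code): linear regression predicts a number, logistic regression predicts a class.

```python
# Regression sketch: linear (predict a number) vs logistic (predict a class).
from sklearn.linear_model import LinearRegression, LogisticRegression

# Linear: recover y = 3x + 1 exactly from noiseless points
X = [[0], [1], [2], [3]]
y = [1, 4, 7, 10]
lin = LinearRegression().fit(X, y)
print(lin.coef_[0], lin.intercept_)  # slope and intercept

# Logistic: pass/fail as a function of hours studied (invented data)
hours = [[0], [1], [2], [8], [9], [10]]
passed = [0, 0, 0, 1, 1, 1]
log = LogisticRegression().fit(hours, passed)
print(log.predict([[1], [9]]))
```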

In this class, we will talk about some trade-offs which we have to be aware of when we choose our training data and model.
Preview 10:13
9 More Sections
About the Instructor
4.3 Average rating
2,985 Reviews
20,746 Students
61 Courses
A 4-person team; ex-Google; Stanford, IIM Ahmedabad, IIT

Loonycorn is us, Janani Ravi, Vitthal Srinivasan, Swetha Kolalapudi and Navdeep Singh. Between the four of us, we have studied at Stanford, IIM Ahmedabad, the IITs and have spent years (decades, actually) working in tech, in the Bay Area, New York, Singapore and Bangalore.

Janani: 7 years at Google (New York, Singapore); Studied at Stanford; also worked at Flipkart and Microsoft

Vitthal: Also Google (Singapore) and studied at Stanford; Flipkart, Credit Suisse and INSEAD too

Swetha: Early Flipkart employee, IIM Ahmedabad and IIT Madras alum

Navdeep: longtime Flipkart employee too, and IIT Guwahati alum

We think we might have hit upon a neat way of teaching complicated tech courses in a funny, practical, engaging way, which is why we are so excited to be here on Udemy!

We hope you will try our offerings, and think you'll like them :-)
