Natural Language Processing

Text and speech analysis projects with Hugging Face Transformers, TensorFlow Hub, TextBlob, and other NLP libraries.
Free tutorial
Rating: 4.1 out of 5 (16 ratings)
1,219 students
52min of on-demand video
English [Auto]

Natural Language Processing: tokenization, tagging, and stemming.
NLP modelling and testing.
Transformers (Hugging Face).
Text analysis with natural language processing.
Speech analysis with natural language processing.
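The first topic above (tokenization, tagging, and stemming) can be illustrated with a toy sketch in plain Python. This is not the course's own material: the tokenizer, the suffix-stripping stemmer, and the tiny tag lexicon are made-up simplifications of what libraries like TextBlob or NLTK provide out of the box.

```python
import re

def tokenize(text):
    # Split text into lowercase word tokens (toy rule; real tokenizers
    # handle punctuation, contractions, and Unicode far more carefully).
    return re.findall(r"[a-z']+", text.lower())

def naive_stem(token):
    # Strip a few common English suffixes (toy rule-based stemmer,
    # a crude stand-in for e.g. the Porter stemmer).
    for suffix in ("ing", "ies", "es", "s", "ed"):
        if token.endswith(suffix) and len(token) > len(suffix) + 2:
            return token[: -len(suffix)]
    return token

# Tiny hand-written tag lexicon; real taggers learn tags from corpora.
TAGS = {"the": "DET", "are": "VERB", "models": "NOUN", "texts": "NOUN"}

tokens = tokenize("The models are processing texts")
stems = [naive_stem(t) for t in tokens]
tags = [TAGS.get(t, "UNK") for t in tokens]
print(stems)  # ['the', 'model', 'are', 'process', 'text']
```

Swapping in a real library replaces all three toy functions with tested, corpus-trained implementations; the pipeline shape (tokenize, then tag or stem each token) stays the same.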


  • No programming experience required; prior knowledge of Python is appreciated.


In this course you will learn the basics of natural language processing and how to develop both trained and pre-trained models. You will also learn how to use NLP libraries such as Hugging Face Transformers, TensorFlow Hub, and TextBlob.

You will also start developing basic models for text and speech analysis.
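As a taste of what a "basic model" for text analysis can look like, here is a minimal lexicon-based sentiment scorer in plain Python. The word lists and the scoring rule are invented for illustration; libraries such as TextBlob ship much larger, curated sentiment lexicons behind a one-line API.

```python
# Toy sentiment lexicons (made up for this sketch; real lexicons
# contain thousands of scored words).
POSITIVE = {"good", "great", "excellent", "love"}
NEGATIVE = {"bad", "poor", "terrible", "hate"}

def sentiment(text):
    # Score = (# positive words) - (# negative words), then bucket the sign.
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("a great course"))   # positive
print(sentiment("a terrible idea"))  # negative
```

Lexicon counting ignores negation and context ("not great" scores positive here), which is exactly the gap that the trained and pre-trained models covered in the course are meant to close.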

Natural Language Processing

Up to the 1980s, most natural language processing systems were based on complex sets of hand-written rules. Starting in the late 1980s, however, there was a revolution in natural language processing with the introduction of machine learning algorithms for language processing. This was due to both the steady increase in computational power due to Moore's law and the gradual lessening of the dominance of Chomskyan theories of linguistics (e.g. transformational grammar), whose theoretical underpinnings discouraged the sort of corpus linguistics that underlies the machine-learning approach to language processing.

Neural networks

Popular techniques include the use of word embeddings to capture semantic properties of words, and an increase in end-to-end learning of a higher-level task (e.g., question answering) instead of relying on a pipeline of separate intermediate tasks (e.g., part-of-speech tagging and dependency parsing). In some areas, this shift has entailed substantial changes in how NLP systems are designed, such that deep neural network-based approaches may be viewed as a new paradigm distinct from statistical natural language processing. For instance, the term neural machine translation (NMT) emphasizes the fact that deep learning-based approaches to machine translation directly learn sequence-to-sequence transformations, obviating the need for intermediate steps such as word alignment and language modeling that were used in statistical machine translation (SMT).
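The idea that word embeddings capture semantic properties can be sketched with cosine similarity over toy vectors. The three-dimensional vectors below are invented for illustration; real embedding models (e.g., those on TensorFlow Hub) learn vectors with hundreds of dimensions from large corpora.

```python
import math

# Made-up 3-dimensional "embeddings" for illustration only.
embeddings = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.85, 0.75, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine(u, v):
    # Cosine similarity: dot product divided by the product of norms.
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Semantically related words should sit closer together in embedding space.
print(cosine(embeddings["king"], embeddings["queen"]) >
      cosine(embeddings["king"], embeddings["apple"]))  # True
```

The same similarity computation underlies many downstream uses of embeddings, from nearest-neighbour word lookups to sentence-level retrieval.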

Who this course is for:

  • Learners curious about machine learning and natural language processing.


Computer Scientist
Abhinav Raj
  • 3.6 Instructor Rating
  • 403 Reviews
  • 21,019 Students
  • 37 Courses

Computer Science & entrepreneurial leadership.

I have completed 14+ brand representations in founder, executive, leadership, developer, and senior volunteer roles, including non-profits. Product engineering is one of my core specializations, but I am also an expert in Agile and DevOps. In recent years I have also delved into blockchain and the metaverse.

Udemy has been a special part of this journey.
