Word2Vec: Build Semantic Recommender System with TensorFlow
What you'll learn
- Building and Training a Word2Vec Model with Python and TensorFlow
- Semantic Recommender System - Practical Project to Semantically Suggest Names
- Source Code *.py Files of All Lectures
- English Captions for All Lectures
- Q&A board to send your questions and get them answered quickly
- Python Level: Intermediate. This Word2Vec tutorial assumes that you already know the basics of writing simple Python programs and that you are generally familiar with Python's core features (data structures, file handling, functions, classes, modules, common library modules, etc.).
- Python 2.7 or Python 3.4, 3.5, or 3.6. TensorFlow does not officially support Python 3.7.
- Our trainees are positive and willing to learn: they practice the lessons and post their questions to the course's Q&A section, and we expect new trainees to have the same spirit.
In this Word2Vec tutorial, you will learn how to train a Word2Vec Python model and use it to semantically suggest names based on one or even two given names.
This Word2Vec tutorial is meant to highlight the interesting, substantive parts of building a word2vec Python model with TensorFlow.
Word2vec is a group of related models that are used to produce Word Embeddings. Embedding vectors created using the Word2vec algorithm have many advantages compared to earlier algorithms such as latent semantic analysis.
Word embedding is one of the most popular representations of document vocabulary. It captures the context of a word in a document, its semantic and syntactic similarity, its relations with other words, and more. A word embedding is a vector representation of a particular word.
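To make "vector representation" concrete, here is a toy sketch (not part of the course code; the vectors are made-up illustrative values, and a real trained model would produce much higher-dimensional ones). Semantically related words end up with vectors pointing in similar directions, which we can measure with cosine similarity:

```python
import math

# Toy 3-dimensional embeddings (illustrative values, NOT from a trained model)
embeddings = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.85, 0.75, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: close to 1.0 = similar direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Related words score higher than unrelated ones
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```

In practice, Word2Vec embeddings typically have on the order of 100–300 dimensions, but the same similarity measure applies.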
The best way to understand an algorithm is to implement it. So, in this course you will learn word embeddings by implementing them in Python with TensorFlow.
Word2Vec is one of the most popular techniques for learning word embeddings using a shallow neural network. It is a particularly computationally efficient predictive model for learning word embeddings from raw text.
In this Word2Vec tutorial, you will learn the idea behind Word2Vec:
1. Take a 3-layer neural network (1 input layer + 1 hidden layer + 1 output layer).
2. Feed it a word and train it to predict its neighbouring words.
3. Remove the last layer (the output layer) and keep the input and hidden layers.
4. Now, input a word from the vocabulary. The output given at the hidden layer is the 'word embedding' of the input word.
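The steps above can be sketched with plain NumPy (a stand-in for the TensorFlow graph built in the course; the weights here are random, untrained values). Because the input is a one-hot vector, the hidden layer's output is simply one row of the input weight matrix, and after training that row is the word's embedding:

```python
import numpy as np

vocab_size, embedding_dim = 5, 3
rng = np.random.default_rng(0)

# Input-to-hidden weights: after training, each row is one word's embedding
W_in = rng.normal(size=(vocab_size, embedding_dim))

word_index = 2                      # pick an arbitrary vocabulary word
one_hot = np.zeros(vocab_size)
one_hot[word_index] = 1.0

# Feed the one-hot input through the (linear) hidden layer...
hidden = one_hot @ W_in

# ...which is identical to looking up row `word_index` of W_in
print(np.allclose(hidden, W_in[word_index]))
```

This is why real implementations skip the matrix multiplication and use an embedding lookup directly.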
In this Word2Vec tutorial we are going to go through all the steps of building and training a Word2Vec model in Python with TensorFlow: pre-processing, tokenizing, batching, structuring the model, and of course training it. Finally, we are going to use our trained Word2Vec model to semantically suggest names based on one or even two given names.
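As a taste of the batching step, here is a hypothetical helper (`skipgram_pairs` is an illustrative name, not a function from the course code) that turns a token list into (center, context) training pairs within a sliding window — the raw material the model is trained on:

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) pairs for every token, looking
    `window` positions to the left and right."""
    pairs = []
    for i, center in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:  # a word is not its own context
                pairs.append((center, tokens[j]))
    return pairs

tokens = "the quick brown fox".split()
print(skipgram_pairs(tokens, window=1))
# [('the', 'quick'), ('quick', 'the'), ('quick', 'brown'),
#  ('brown', 'quick'), ('brown', 'fox'), ('fox', 'brown')]
```

In the course, pairs like these (mapped to integer word indices) are grouped into mini-batches and fed to the TensorFlow model.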
Who this course is for:
- This Word2Vec tutorial is meant for those who are familiar with Python and want to learn how to use TensorFlow to implement Word2Vec Word Embeddings, building a real-life Semantic Recommendation System.
GoTrained is an e-learning academy that aims to create useful content in different languages, concentrating on technology and management.
We take a special approach to selecting the content we provide: we mainly focus on skills that are frequently requested by clients and employers but covered by only a few videos. We also try to build video series that cover not only the basics but also the advanced areas.
We try to tackle increasingly complex topics while helping the world live better.
A data scientist with development experience and an interest in marketing. I dream of helping the marketing side of large-scale companies with data science.
Also a server administrator when needed.
Reading non-fiction books by Dan Ariely, Seth Godin, Nassim Taleb, Kevin Kelly, Mihaly Csikszentmihalyi, and others.
Sometimes a blogger