TF Regression Exercise

Jose Portilla
A free video tutorial from Jose Portilla
Head of Data Science, Pierian Data Inc.
4.6 instructor rating • 30 courses • 1,986,782 students

Learn more from the full course

Complete Guide to TensorFlow for Deep Learning with Python

Learn how to use Google's deep learning framework, TensorFlow, with Python! Solve problems with cutting-edge techniques!

14:07:23 of on-demand video • Updated April 2020

  • Understand how Neural Networks Work
  • Build your own Neural Network from Scratch with Python
  • Use TensorFlow for Classification and Regression Tasks
  • Use TensorFlow for Image Classification with Convolutional Neural Networks
  • Use TensorFlow for Time Series Analysis with Recurrent Neural Networks
  • Use TensorFlow for solving Unsupervised Learning Problems with AutoEncoders
  • Learn how to conduct Reinforcement Learning with OpenAI Gym
  • Create Generative Adversarial Networks with TensorFlow
  • Become a Deep Learning Guru!
Welcome, everyone, to the TensorFlow regression exercise lecture. It's time to test your new skills: you'll be creating a model that attempts to predict housing prices using the TensorFlow estimator API. In this lecture we're going to review the exercise notebook and explain what you need to do. As an option, if it's more geared toward your learning style to code along with lectures, you can just skip this lecture and go to the next one, which is a solutions code-along walkthrough. For now, let's open up the exercise notebook and go over what you need to do.

OK, so here you have the regression exercise notebook open. The dataset you're going to be using is California housing data from the 1990 census. Essentially, each sample is an aggregate block group containing around 4,500 individuals living in some sort of geographically compact area in California. There's a link here in case you want to find out more information about the data, but I've actually already saved it for you in our zip file as cal_housing_clean.csv.

So what you're going to do, once you have that data, is import it using pandas, and then I also want you to separate it into training (70 percent) and testing (30 percent) sets. This is what the data should look like, and you can go ahead and call describe() on it to get a better idea of the statistical information in the data. Once you have the train/test split ready to go, you're going to scale just the feature data. You'll use scikit-learn's preprocessing module to create a MinMaxScaler for the feature data, and you're going to fit that scaler only to the training data. Basically, as you go along this notebook, follow the instructions in bold. You're going to transform X_test and X_train, but remember: when you're fitting the scaler, you only want to fit it to the training data, because you don't want to assume you'll have information about future test data.
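The load/split/scale steps above can be sketched as follows. This is a minimal sketch only: it uses a randomly generated toy DataFrame in place of the real `pd.read_csv("cal_housing_clean.csv")` call, and the housing column names shown are assumptions about the file, not confirmed by this lecture.

```python
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler

# Toy stand-in for pd.read_csv("cal_housing_clean.csv"): 6 feature columns
# plus a target column (column names here are assumed, not from the lecture).
rng = np.random.default_rng(42)
df = pd.DataFrame(
    rng.uniform(0, 100, size=(20, 7)),
    columns=["housingMedianAge", "totalRooms", "totalBedrooms",
             "population", "households", "medianIncome", "medianHouseValue"],
)

X = df.drop("medianHouseValue", axis=1)
y = df["medianHouseValue"]

# 70 percent train / 30 percent test, as the exercise asks.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=101
)

# Fit the scaler on the TRAINING features only, then transform both sets --
# fitting on test data would leak information about "future" data.
scaler = MinMaxScaler()
scaler.fit(X_train)
X_train_scaled = pd.DataFrame(scaler.transform(X_train),
                              columns=X_train.columns, index=X_train.index)
X_test_scaled = pd.DataFrame(scaler.transform(X_test),
                             columns=X_test.columns, index=X_test.index)
```

Note that the scaled training features land exactly in [0, 1], while scaled test features can fall slightly outside that range, which is expected when the scaler has never seen the test data.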
Then, once you have that scaled X_test data and scaled X_train data, you can use it along with pd.DataFrame to recreate two DataFrames holding the scaled versions of those data sets, because remember, MinMaxScaler is just going to return a NumPy array. Then you're going to create feature columns: you'll create the necessary tf.feature_column objects for the estimator. They should all just be treated as continuous numeric columns, just like in the lectures. Then you'll create an input function for the estimator object, and then you'll create the actual estimator object. You're going to create a densely connected neural network regressor model, and I want you to play around with the hidden units to see what works best. As a default, you should probably choose three layers, with each layer having as many neurons as there are features. So if we come back up here, we can see that there are 1, 2, 3, 4, 5, 6 features, so maybe have [6, 6, 6] as the hidden units list for the actual estimator model: just a list of six, comma, six, comma, six.

Then I want you to train the model for a thousand steps; later, you should come back to it and train it for many more steps, maybe 10,000 or 20,000, and see how much that improves the model. Once you've trained the model, you'll create a prediction input function and use the predict method of your estimator to create a list of predictions, and then you can calculate the root mean squared error (RMSE) of those predictions. You should be able to get an RMSE value of around 100,000 once you've run it for, I would say, around 10,000 or 20,000 steps. OK, best of luck to everyone, and I will see you in the next lecture, where we walk through the solutions.
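The model/predict/RMSE steps above can be sketched roughly as below. To keep the sketch self-contained and runnable, it uses synthetic data and scikit-learn's MLPRegressor as a stand-in for the course's TensorFlow DNNRegressor estimator (the exercise itself should use tf.feature_column, an input function, and tf.estimator.DNNRegressor, as described in the lecture); the three hidden layers of six neurons each mirror the lecture's suggested [6, 6, 6].

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

# Synthetic stand-in for the scaled housing features: 6 features, as in the
# exercise, with a noisy linear target standing in for medianHouseValue.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 6))
y = X @ np.array([3.0, 1.5, -2.0, 0.5, 1.0, 4.0]) + rng.normal(0, 0.1, 200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=101
)

# Three hidden layers of 6 neurons each -- one neuron per feature, as the
# lecture suggests. (A stand-in for tf.estimator.DNNRegressor with
# hidden_units=[6, 6, 6]; more training steps generally improve the fit.)
model = MLPRegressor(hidden_layer_sizes=(6, 6, 6), max_iter=5000,
                     random_state=101)
model.fit(X_train, y_train)

predictions = model.predict(X_test)

# Root mean squared error of the predictions on the held-out test set.
rmse = np.sqrt(mean_squared_error(y_test, predictions))
print(rmse)
```

On the real (unscaled, dollar-valued) housing targets, an RMSE near 100,000 is the ballpark the lecture cites; on this toy data the absolute value is not meaningful, only the workflow.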