Time Series Exercise Overview

Jose Portilla
A free video tutorial from Jose Portilla
Head of Data Science, Pierian Data Inc.
4.6 instructor rating • 32 courses • 2,254,813 students

Learn more from the full course

Complete Guide to TensorFlow for Deep Learning with Python

Learn how to use Google's Deep Learning Framework - TensorFlow with Python! Solve problems with cutting edge techniques!

14:07:23 of on-demand video • Updated April 2020

  • Understand how Neural Networks Work
  • Build your own Neural Network from Scratch with Python
  • Use TensorFlow for Classification and Regression Tasks
  • Use TensorFlow for Image Classification with Convolutional Neural Networks
  • Use TensorFlow for Time Series Analysis with Recurrent Neural Networks
  • Use TensorFlow for solving Unsupervised Learning Problems with AutoEncoders
  • Learn how to conduct Reinforcement Learning with OpenAI Gym
  • Create Generative Adversarial Networks with TensorFlow
  • Become a Deep Learning Guru!
Welcome back everyone. In this lecture we're going to do a quick overview of the time series exercise, and then the next lecture will be the full solutions video. So let's hop over to your Recurrent Neural Networks folder and I'll show you how you can get the exercise notebook.

Under the Recurrent Neural Networks folder you should see a time series exercise .ipynb notebook file, as well as a solutions file for the time series exercise. And if you scroll down, you'll also notice that there's a monthly milk production CSV file in that same folder. That's the file we're actually going to be using. So when you click on that exercise notebook, you should get something that looks like this: the time series exercise. As always, all you really have to do is follow along with the instructions in bold, and if you ever get stuck on something, just go ahead and watch the solutions video, where you'll get an explanation along with the code.

What we're going to be doing is trying to create a recurrent neural network model that can successfully predict monthly milk production based off a real data set. Here's a link to the actual data set. Remember, it's already been downloaded for you, but in case you want another source, you can click on that link. It's basically just milk production per month, in units of pounds per cow, from January 1962 to December 1975. Again, this is an older data set, but the reason I really like using it is because, as you'll see when you eventually plot it out, there's a pretty obvious visual trend here: you get this upward trend, and there's definitely a seasonality to it. So hopefully it should be really obvious whether or not your network is doing a good job of generating a time series for it. So what you end up doing here is the usual imports: use pandas to read in the CSV file and check out the head of the DataFrame.
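The first step described above, reading the CSV and checking the head, might look something like this minimal sketch. The actual exercise reads the milk production file from disk; here a tiny inline CSV stands in for it, and the column names are assumptions based on the lecture description.

```python
from io import StringIO

import pandas as pd

# Inline stand-in for the course's monthly milk production CSV
# (the real exercise would pass the CSV's file path to read_csv).
csv_data = StringIO(
    "Month,Milk Production\n"
    "1962-01,589\n"
    "1962-02,561\n"
    "1962-03,640\n"
)

# Read the CSV, then check out the head of the DataFrame.
milk = pd.read_csv(csv_data)
print(milk.head())
```

Note that at this point the "Month" column is still just a plain string column, which is exactly the situation the next step addresses.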
You'll notice here that this is actually still not a datetime index, so you're going to convert it to a datetime index using the following command. And then from there you'll be able to easily plot out this milk production plot, once you've formatted the data correctly.

The next step is to perform a train test split. We're going to attempt to predict a year's worth of data, so 12 months, or 12 steps into the future. To successfully do this, you're not going to be doing a random train test split, because for a time series analysis that doesn't really convey what you're actually trying to accomplish. What we want to do is feed everything into our network except the last year of data, and then when we actually test our network, we're going to test it against that last year and see how well our network can predict the year we do know versus its actual prediction cycle. So again, what we're going to do is take all the data except for that very last year and treat that as our training set, and then the last 12 months of data will be our test set, to see how well our model can actually create one of these cycles as a prediction. And then, hopefully, if it performs well, we'd be able to use it for further years, like 1976 or 1977, et cetera. Now, there is definitely a limit to how far into the future you can accurately predict, but for now we'll use this last year of test data that we do know as our evaluation. So you're going to perform the train test split using indexing, not a random train test split, so follow the instructions here: the last 12 months of data are the test set; everything else is for training. Then we're going to scale that data. We didn't have to do that in the previous lectures, because we were just using sine of x, but in this case this is real data, so we should use some preprocessing and run the MinMaxScaler on it.
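The three steps just described, converting to a datetime index, splitting by index position rather than randomly, and scaling, could be sketched as follows. This uses synthetic stand-in data with a trend plus seasonality (the real exercise uses the milk CSV), and the exact column name is an assumption.

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import MinMaxScaler

# Synthetic stand-in for the milk data: 48 months of upward trend plus
# seasonality, with the dates stored as strings, just like the raw CSV.
months = pd.date_range("1962-01-01", periods=48, freq="MS").strftime("%Y-%m")
values = 600 + 2 * np.arange(48) + 50 * np.sin(np.arange(48) * 2 * np.pi / 12)
milk = pd.DataFrame({"Milk Production": values}, index=months)

# Convert the string index to a proper DatetimeIndex so plotting works nicely.
milk.index = pd.to_datetime(milk.index)

# Index-based train/test split: hold out the last 12 months as the test set.
train_set = milk.iloc[:-12]
test_set = milk.iloc[-12:]

# Fit the scaler on the training data ONLY, then transform both sets.
scaler = MinMaxScaler()
train_scaled = scaler.fit_transform(train_set)
test_scaled = scaler.transform(test_set)
```

Fitting on the training set alone is the key point here: the scaler's min and max must come only from data the model is allowed to have seen.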
And remember, you're only going to fit that scaler on the training data and then transform the test data. You should not fit on the test data; otherwise you're assuming you would already know about future behavior. So again, really important points here: the train test split is different than what we typically do because it's a time series, and so is the fit and transform.

Then you're going to create a batch function, a function that's going to feed batches of training data. You're going to do several things, so you'll need to follow these steps: step one, step two, and step three. And if this is pretty hard for you, feel free to reference the solutions, but this is heavily based off of what we just did in the previous lectures, so you can use those as references. There are essentially instructions here for you to follow for each of these three steps: you're going to create a function that basically takes in that training data, takes in a batch size, takes in a number of steps, and feeds batches back out.

Then you're going to set up your RNN model. You'll import TensorFlow, and you're going to have constants; you can find these constants in a cell. I have things like the learning rate already laid out for you, the number of neurons, the number of iterations for training, et cetera. But again, these are just the ones I used in my solution; you can definitely play around with some of these values. That's not to say that this learning rate is the learning rate you should use. You should definitely play around and see if larger or smaller learning rates work better for this particular dataset, and again, it's also going to depend on what type of cells you decide to eventually use. Then you'll go ahead and create placeholders for x and y, and then you'll create the recurrent neural network layer, your cell layer. You really have complete freedom over this.
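The batch function described above (take in the training data, a batch size, and a number of steps; feed batches back out) might be sketched like this in plain numpy. The function name and window layout are assumptions based on the previous lectures, not the exercise's required signature.

```python
import numpy as np

def next_batch(training_data, batch_size, steps):
    """Grab random windows of length steps+1; return x and y shifted by one step.

    training_data: 2-D array of scaled values, shape (n_samples, 1).
    """
    # Step 1: pick a random starting point for each window in the batch.
    rand_start = np.random.randint(0, len(training_data) - steps, batch_size)
    # Step 2: slice out steps+1 consecutive values per window.
    batch = np.array([training_data[s:s + steps + 1] for s in rand_start])
    # Step 3: y is x shifted one time step into the future.
    return batch[:, :-1].reshape(-1, steps, 1), batch[:, 1:].reshape(-1, steps, 1)

# Example: 36 scaled training points, a batch of 4 windows of 12 steps each.
train_scaled = np.linspace(0, 1, 36).reshape(-1, 1)
x_batch, y_batch = next_batch(train_scaled, batch_size=4, steps=12)
print(x_batch.shape, y_batch.shape)  # (4, 12, 1) (4, 12, 1)
```

The shapes matter here: `(batch_size, steps, 1)` is what a dynamic RNN call expects for a single-feature time series.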
I really want you to explore and play around with what works well, but if you're ever in doubt, keep in mind the solutions use something like an OutputProjectionWrapper around a basic LSTM cell, or even a basic gated recurrent unit (GRU) cell. So if you're in doubt about what combination you should use, just try an OutputProjectionWrapper around a basic LSTM cell with a rectified linear unit (ReLU) activation function. Then you pass those cells into tf.nn.dynamic_rnn, along with your first placeholder, x. Just like in the previous lectures, you're going to create the mean squared error loss function and minimize it with the Adam optimizer. Then you're going to initialize the global variables, create an instance of tf.train.Saver so you can save your actual model, and then run a session that trains it.

And then, once you're done training it, you're going to try to predict the future. Remember that the test set is the last 12 months of original data that your model has never seen before. So you're going to perform a generative session that's going to allow you to try to predict those last 12 months of data. What you should eventually end up with, after following these instructions, is something that looks like this: milk production, that's the real data, and then the generated values. You should get something that's pretty well aligned with the actual milk production. You can see we're a bit off, but the general behavior and trends seem to be pretty darn close.

OK, so this is a pretty challenging exercise. Always feel free to jump to the solutions lecture in case you get stuck anywhere. And like I mentioned, play around with the units, play around with the training steps, play around with the learning rate; that's really half of the fun here. OK, thanks everyone, and I'll see you at the next lecture, where we go through the solutions.
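The generative session described above follows a simple pattern: seed with the last known window, predict one step, append the prediction, and slide the window forward. Here's a minimal sketch of that loop, with a hypothetical `predict_next_step` function standing in for the trained network's `sess.run` call; the stand-in just extrapolates upward so the loop is runnable on its own.

```python
import numpy as np

# Hypothetical stand-in for sess.run(outputs, feed_dict={x: window}) on the
# trained RNN: here it just predicts the last value plus a small increment.
def predict_next_step(window):
    return window[-1] + 0.01

num_steps = 12  # length of the window fed to the network at each step

# Seed the generation with the last num_steps training values (scaled).
train_scaled = np.linspace(0, 1, 36)
seed = list(train_scaled[-num_steps:])

# Generative loop: predict one step, append it, slide the window forward.
for _ in range(12):
    window = np.array(seed[-num_steps:])
    seed.append(predict_next_step(window))

generated = np.array(seed[-12:])  # the 12 generated "future" months
print(generated.shape)  # (12,)
```

In the exercise itself, each prediction would come from the trained RNN, and you'd apply the scaler's `inverse_transform` to the generated values before plotting them against the real last 12 months.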