TensorFlow Basic Syntax

Jose Portilla
A free video tutorial from Jose Portilla
Head of Data Science, Pierian Data Inc.
4.6 instructor rating • 32 courses • 2,250,669 students

Learn more from the full course

Complete Guide to TensorFlow for Deep Learning with Python

Learn how to use Google's Deep Learning Framework - TensorFlow with Python! Solve problems with cutting edge techniques!

14:07:23 of on-demand video • Updated April 2020

  • Understand how Neural Networks Work
  • Build your own Neural Network from Scratch with Python
  • Use TensorFlow for Classification and Regression Tasks
  • Use TensorFlow for Image Classification with Convolutional Neural Networks
  • Use TensorFlow for Time Series Analysis with Recurrent Neural Networks
  • Use TensorFlow for solving Unsupervised Learning Problems with AutoEncoders
  • Learn how to conduct Reinforcement Learning with OpenAI Gym
  • Create Generative Adversarial Networks with TensorFlow
  • Become a Deep Learning Guru!
English [Auto] Welcome everyone to this lecture on TensorFlow basic syntax. Here we're going to learn the very basics of TensorFlow. We'll start off by actually creating tensors, just constant tensors, then we'll go into computations, and then running a session in TensorFlow. Let's open up a Jupyter notebook and get started.

All right, the first thing we're going to do is import TensorFlow. We're already pretty far into the course, but now is the very first time we actually get to use TensorFlow. And just to make sure you're using the same version I am in the environment file, go ahead and run this line right here: print tf.__version__. It should show some variation of 1.3, and it doesn't matter if it says .0 at the end, as long as you're using TensorFlow 1.3. Future versions like 1.4 and 1.5 may have very small syntax changes, and since we're just learning TensorFlow I don't want you to get hung up on those, so go ahead and make sure you're using 1.3. Then once you fully understand TensorFlow you can easily move on to a more updated version, in case you're watching this in the future.

Let's start off by actually creating a tensor. The word tensor is basically just a fancy word for an n-dimensional array. We'll start off by creating the most basic tensor possible, and that is a constant. So I'm going to create a variable called hello, and we'll say tf.constant and pass in a string here: 'Hello', and I can actually have a space at the end. Then I'm going to create another constant here, we'll call it world; it's also going to be tf.constant, and this will be, as you may have guessed, the string 'World'. If I take a look at what type of object this is, it is not a string object; it is a tensorflow.python.framework.ops.Tensor. So this by itself is just a Tensor object.
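As a minimal sketch of these first steps in code (written against the 1.x-style API the lecture uses; on TensorFlow 2.x that API lives under tf.compat.v1, which is what the import below assumes — on TensorFlow 1.3 itself you would simply write `import tensorflow as tf`):

```python
# 1.x-style API; on TF 1.3 this would just be `import tensorflow as tf`
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()  # restore 1.x graph/session behavior on TF 2.x

print(tf.__version__)  # the lecture expects some variation of 1.3

hello = tf.constant('Hello ')  # note the trailing space
world = tf.constant('World')

# These are Tensor objects, not Python strings
print(type(hello))
```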
So as you may have guessed, if I try to print the variable hello, I'm not going to get a string. Instead it's going to say, hey, this is a tensor, it's a constant, and the data type inside of this tensor is a string. It's not actually going to print out the word 'Hello'. In order to actually get 'Hello' to print, what we need to do is run this sort of operation inside of a session, just like we did in our previous manual neural network. The way we actually create a TensorFlow session is with the following command: we say with tf.Session() as sess: and then you can have a block of code indented here, and everything inside of it is just TensorFlow operations that you run. The reason we use the keyword with is that it makes sure we don't have to actually close the session: it automatically opens the session, runs the block of code, and then closes the session. Let's go ahead and do a simple run command, so we say sess.run, and then I can pass an operation here. Let's do a concatenation operation, basically hello plus world. We're going to run that, and since I actually didn't save it as a result, let's run this again but assign it to result, and then outside of the session I can print the result, and it says 'Hello World'. If you're wondering what this b represents right in front of the string, it just indicates that in Python 3 this is a bytes literal, and for our purposes we don't really need to concern ourselves too much with it.

Continuing on, let's go ahead and explore more of the basics of TensorFlow and perform another computation. Let's do something like addition. I'm going to say a is equal to tf.constant and put a number here like 10, then we're going to create another constant, tf.constant, and put 20. And again, if I check the type of a, it again is just a Tensor.
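Sketched in code, under the same 1.x-API assumption via tf.compat.v1 as before:

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

hello = tf.constant('Hello ')
world = tf.constant('World')

# `with` opens the session, runs the block, and closes the session for us
with tf.Session() as sess:
    result = sess.run(hello + world)  # string tensors concatenate under +

print(result)  # the leading b marks a Python 3 bytes literal

a = tf.constant(10)
b = tf.constant(20)
print(type(a))  # again just a Tensor, not a plain int
```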
And if we do something like a + b, the result right now says, hey, this is a tf Tensor, add, shape (), dtype int32. If I run a + b again, notice it now says add_1, which means TensorFlow is actually keeping track of these operations in the background, numbering them add_1, add_2, add_3 as you keep asking for them. Now keep in mind it hasn't actually executed these tasks, because we didn't run them inside of a session. So to actually run them inside of a session, we say with tf.Session() as sess: result = sess.run(a + b), and then if I check out my result, it's 30; 10 plus 20 is 30.

OK, so those are very basic computations. Let's go on and I'll show you some more operations, and the operations we're going to cover are really more in line with the TensorFlow version of NumPy operations. Remember, with NumPy we were creating matrices like zeros, ones, random normal distributions, random uniform distributions. So I'm going to create a bunch of operations here that we can check out. We're going to create a constant again, so we have a constant operation for just a constant number. Sometimes you need to have a matrix filled with a value, so we'll say fill_mat = tf.fill, and if you do Shift+Enter here it says, hey, this is going to create a tensor (remember, that's just a fancy word for an n-dimensional array) filled with a scalar value. Then we're going to provide it with what it wants: the dimensions and the value to fill with. So we'll say, hey, give me a 4 by 4 filled with the value 10; that's our filled matrix. Then we can say something like my_zeros, and we have tf.zeros, another quick operation TensorFlow gives you, which again just creates a tensor with all elements set to zero.
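A short sketch of the addition step (same tf.compat.v1 assumption as the earlier snippets):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

a = tf.constant(10)
b = tf.constant(20)

# Each a + b builds a new node in the graph, so the tensor names
# read add, add_1, add_2, ... as you keep asking for the operation
print(a + b)
print(a + b)

# Nothing is executed until we run the operation inside a session
with tf.Session() as sess:
    result = sess.run(a + b)

print(result)  # 30
```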
So let's give it the shape; we'll again ask for a 4 by 4. We're going to do the same thing for ones, as you may guess, tf.ones, and go ahead and keep it 4 by 4. And I'll show you just a few random distributions you can use. There's a random normal distribution; we'll call it my_randn, and keep in mind everything on the left-hand side of the equals sign is just the variable name. Then we're going to say tf.random, and as we begin to type random you can see there are a ton of options here. We'll explore the options as we need them throughout the course, but random_normal is the more common one. It just outputs random values from a normal distribution, and you can actually provide the mean and standard deviation as well as the shape. We've been doing 4 by 4 for everything, so let's continue that trend and actually just keep the defaults, but in case you want to specify them, you could say mean is equal to zero, and for standard deviation, I forget what the default was, I think it's 1.0; you can obviously change that as you see fit. A uniform distribution is also a very common distribution to be using, so I'll say my_randu = tf.random_uniform, and it's the same thing, 4 by 4. For a random uniform, instead of a mean and standard deviation it wants a minimum value and a maximum value, where you basically draw values in a uniform manner from that minimum up to the max value (and if you want a negative minimum value, that's OK too). So minval is 0 and maxval is 1. OK, so we have a bunch of operations here, and none of these have really been executed yet. If you just call one of them, like my_zeros, you don't see any values; it just says, hey, this is a TensorFlow tensor, kind of just waiting for you to execute it or run it in a session.
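Put together, the batch of NumPy-like operations described above looks roughly like this (again written against the 1.x-style API via tf.compat.v1; variable names follow the lecture):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

my_const = tf.constant(10)
fill_mat = tf.fill((4, 4), 10)   # 4x4 tensor filled with the value 10
my_zeros = tf.zeros((4, 4))      # 4x4 of zeros
my_ones  = tf.ones((4, 4))       # 4x4 of ones

# Random draws: normal takes mean/stddev, uniform takes minval/maxval
my_randn = tf.random_normal((4, 4), mean=0.0, stddev=1.0)
my_randu = tf.random_uniform((4, 4), minval=0, maxval=1)

# Nothing has executed yet -- this just prints the Tensor description
print(my_zeros)
```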
So I'm going to create a list here called my_ops, which is going to be full of these operations: my_const, fill_mat (just using tab-complete to do this quickly), my_zeros, my_ones, my_randn, and then my_randu. OK, so now we have a list of all of these; let's go ahead and run them inside of a session. Usually we're always going to be using this with tf.Session() pattern; that's pretty much how you'll always see it in the documentation. But I do want to introduce you to something called an InteractiveSession. It's pretty useful for notebook settings like this Jupyter notebook; it doesn't really have much use outside of a notebook setting, depending on how you're actually coding TensorFlow in whatever IDE you're using. Basically, an InteractiveSession allows you to keep calling the session throughout multiple cells. Let me show you how to do that. We really won't be using it much throughout the course, but in case you're interested in it, now is a good time to introduce it. You just say sess = tf.InteractiveSession(), and then basically the rest of these cells are going to behave as if they were already inside a with tf.Session() block. Again, this InteractiveSession is really only useful in the notebook setting. So I'm going to say for op in my_ops: sess.run(op), and let's actually print this out so we can see the results. Run that, and here we can see all the results; let's add a new line in between each result. And here we have it: I can see the constant, I can see the fill matrix (remember, it was a 4 by 4 of tens), my zeros matrix, my ones matrix, and then my two random matrices. So again, the reason I was able to do this outside of an actual with block was because I had this InteractiveSession.
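A condensed sketch of this step, under the same tf.compat.v1 assumption; in a notebook each statement after the InteractiveSession line could live in its own cell:

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

my_ops = [
    tf.constant(10),
    tf.fill((4, 4), 10),
    tf.zeros((4, 4)),
    tf.ones((4, 4)),
    tf.random_normal((4, 4)),
    tf.random_uniform((4, 4)),
]

# InteractiveSession installs itself as the default session, so later
# cells can call sess.run() without being inside a `with` block
sess = tf.InteractiveSession()

for op in my_ops:
    print(sess.run(op))
    print('\n')  # blank line between results
```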
It's really useful for a Jupyter notebook environment, but to stick with the actual documentation and all the other examples you see online, we'll pretty much always be using with tf.Session(), unless it's a really quick job that I want to run across multiple cells. OK, so we just have sess.run(op). Something to note is that a lot of these operations have an eval method on them. We may see that in the future, where instead of saying sess.run and passing in the operation, you take the operation and start calling eval(); there's an evaluation method which essentially tells it, hey, evaluate this operation, and you get the exact same results when you run that. So again, typically we'll be saying sess.run instead of calling eval, but for something quick and dirty we may do an InteractiveSession and just use eval.

All right, continuing on, lastly I want to talk about matrix multiplication. We use matrix multiplication a lot with neural networks, especially our basic neural networks. So let's create a matrix here; we'll have it be a constant, and we're going to feed this in as a nested list. I'll say 1, 2 here, comma, and then let's go ahead and say 3, 4. It's actually a 2 by 2 matrix: it has 1, 2 on the top row and 3, 4 on the bottom row, just a nested list here. And then on a I can call get_shape(), and it says that the shape of this tensor is 2 by 2, which makes sense, since that's what we provided. Let's go ahead and create one more constant; we'll have it be a 2 by 1, so the first number is 10 and the second is 100. This is where you may have to refresh your linear algebra after we do this multiplication, but essentially if we get the shape, this one is a 2 by 1. So I'm going to say my result is equal to tf.matmul. Hopefully that looks a little familiar to you based off our basic neural network when we implemented it.
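The matrix multiplication walkthrough above, sketched in code (same tf.compat.v1 assumption; the expected product of a 2x2 by a 2x1 is a 2x1):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

a = tf.constant([[1, 2],
                 [3, 4]])        # 2x2 matrix as a nested list
b = tf.constant([[10], [100]])   # 2x1 matrix

print(a.get_shape())  # (2, 2)
print(b.get_shape())  # (2, 1)

result = tf.matmul(a, b)

# Only possible across cells because of the InteractiveSession
sess = tf.InteractiveSession()
print(sess.run(result))  # [[210], [430]]
print(result.eval())     # .eval() gives the same answer in a default session
```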
So I have my result here, and then I can say sess.run(result), and it gives me back the actual array. It multiplied this 2 by 2 array by the 2 by 1, and as a result you get back a 2 by 1. Now keep in mind, usually you'd have to run this within a session; it's only because I called this InteractiveSession that I'm able to run it across multiple cells. Pretty useful for a notebook, not super useful anywhere else. OK, and one last reminder: I could have just called eval to see the results as well. That's the very basics of TensorFlow syntax. I really hope that felt pretty familiar, especially after our manual neural network, and you can see here the TensorFlow framework doing a lot of the heavy lifting behind the scenes for you. The main things you should have gotten out of this lecture are that you can create basic constants and operations and then run them within a session. Thanks everyone, and I'll see you at the next lecture.