TensorFlow Basic Syntax
A free video tutorial from Jose Portilla
Head of Data Science, Pierian Data Inc.
4.6 instructor rating • 34 courses • 2,476,998 students
Learn more from the full course: Complete Guide to TensorFlow for Deep Learning with Python
Learn how to use Google's Deep Learning Framework - TensorFlow with Python! Solve problems with cutting edge techniques!
14:07:23 of on-demand video • Updated April 2020
- Understand how Neural Networks Work
- Build your own Neural Network from Scratch with Python
- Use TensorFlow for Classification and Regression Tasks
- Use TensorFlow for Image Classification with Convolutional Neural Networks
- Use TensorFlow for Time Series Analysis with Recurrent Neural Networks
- Use TensorFlow for solving Unsupervised Learning Problems with AutoEncoders
- Learn how to conduct Reinforcement Learning with OpenAI Gym
- Create Generative Adversarial Networks with TensorFlow
- Become a Deep Learning Guru!
English [Auto] Welcome, everyone, to this lecture on TensorFlow basic syntax. Here we're going to learn the very basics of TensorFlow. We'll start off by actually creating tensors, just constant tensors, then we'll go into computations, and then running a session in TensorFlow. Let's open up a Jupyter notebook and get started. All right, the first thing we're going to do is import TensorFlow. We're already pretty far into the course, but this is the very first time we actually get to use it. And just to make sure you're using the same version I am in the environment file, go ahead and run this line right here: print tf.__version__. It should show some variation of 1.3. It doesn't matter what it says at the end, like .0, but make sure you're using TensorFlow 1.3. Future versions like 1.4 and 1.5 may have very small, slight syntax changes, and since we're just learning TensorFlow, I don't want you to get hung up on those. So go ahead and make sure you're using 1.3; once you fully understand TensorFlow, you can easily move on to a more updated version, in case you're watching this in the future.

Let's start off by actually creating a tensor. The word tensor is basically just a fancy word for an n-dimensional array. We'll start with the most basic tensors possible, and that is a constant. So I'm going to create a variable called hello, set it to tf.constant, and pass in a string here: 'Hello ', and I can actually have a space at the end. Then I'm going to create another constant here; we'll call it world. It's also going to be a constant, and this will be, as you may have guessed, the string 'World'. If I take a look at what type of object this is, it is not a string object; it is a tensorflow.python.framework.ops.Tensor.
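The constants described above can be sketched as follows. This is a hedged reconstruction rather than the lecture's exact notebook: the course pins TensorFlow 1.3, where the import is simply `import tensorflow as tf`; the `compat.v1` shim below is an assumption for readers on a TensorFlow 2.x install who want the same 1.x graph behavior.

```python
# Sketch of the constants created in the lecture. The course uses TensorFlow 1.3
# (plain `import tensorflow as tf`); the compat.v1 shim is an assumption for
# readers on TensorFlow 2.x who want the 1.x-style graph/session behavior.
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()  # restore 1.x-style deferred execution

print(tf.__version__)  # the lecture expects some variation of 1.3

hello = tf.constant('Hello ')  # note the trailing space
world = tf.constant('World')

print(type(hello))  # a Tensor object, not a Python string
```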
This variable right here is a Tensor object. So as you may have guessed, if I try to print the variable hello, I am not going to get a string. Instead it's going to say, hey, this is a Tensor, it's a constant, and the data type inside of this tensor is a string. It's not actually going to print out the word hello. In order to actually get hello to print, we need to run this sort of operation inside of a session, just like we did in our previous manual neural network. The way we actually create a TensorFlow session is with the following command: we say with tf.Session(), open and close parentheses, as sess, colon, and then you can have a block of code here, indented, and everything inside that block is just going to be TensorFlow operations that you run. The reason we use this keyword with is that it makes sure we don't have to actually close the session: it automatically opens the session, runs the block of code, and then closes the session.

Let's go ahead and do a simple run command. We say sess.run and then I can create an operation here, so let's do a concatenation operation, basically hello plus world. We're going to run that, and since I actually didn't save it as a result, let's run this again but assign it to result. Then, outside of the session, I can print the result, and it says Hello World. If you're wondering what this b represents right in front of the string, it just means that in Python 3 this is a bytes literal, and for our purposes we don't really need to concern ourselves too much with this b. Continuing on, let's explore more of the basics of TensorFlow. Let's perform another computation; let's do something like addition. I'm going to say a is equal to tf.constant and put a number here like 10. We'll create another constant, b; let's put 20. And again, if I check the type of a, it is just a Tensor.
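The session walkthrough above might look like this minimal sketch (again assuming the `compat.v1` shim on a modern install; on the course's TensorFlow 1.3 you would simply `import tensorflow as tf`):

```python
# Running the string concatenation inside a session, as described above.
# compat.v1 shim assumed for TensorFlow 2.x installs.
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

hello = tf.constant('Hello ')
world = tf.constant('World')

# `with` opens the session, runs the block, and closes the session for us
with tf.Session() as sess:
    # `+` on string tensors builds a concatenation op; run() executes it
    result = sess.run(hello + world)

print(result)  # b'Hello World' -- the b marks a Python 3 bytes literal
```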
And if we do something like a + b, the result right now says, hey, this is Tensor add, with data type int32. If I run a + b again, notice it now says add_3, which means TensorFlow is somehow keeping track of this in the background; it's numbering them add_2, add_3, and so on. If I were to copy this and run it again, it keeps track of how many times we're asking for this. Now keep in mind, it hasn't actually executed these tasks, because we didn't run them inside of a session. So let's actually run them inside of a session. We'll say with tf.Session() as sess, result is equal to sess.run, and then we can input the operation here, a + b. Then if I check out my result, it's 30: 10 plus 20 is 30.

OK, so those are very basic computations. Let's go ahead and show you some more operations. The operations I'm going to cover are really more in line with the TensorFlow version of NumPy operations. Remember, with NumPy we were creating matrices like zeros, ones, random normal distributions, and random uniform distributions. I'm going to create a bunch of those operations here that we can check out. I'll create a constant again, so we have a constant operation that's just a constant number. Sometimes you need a matrix filled out, so we'll say fill_mat, and then tf.fill. If you do Shift+Enter here, it says, hey, this is going to create a tensor (remember, just a fancy word for an n-dimensional array) filled with a scalar value. Then we provide it with what it wants: the dimensions and the value to fill it with. So we'll say, hey, give me a 4 by 4 filled with the value 10. That's our fill matrix. Then we can say something like my_zeros, and we have tf.zeros.
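The addition walkthrough above can be sketched like this (compat.v1 shim assumed for TensorFlow 2.x; on 1.3 just `import tensorflow as tf`):

```python
# Integer addition: building the op first, then executing it in a session.
# compat.v1 shim assumed for TensorFlow 2.x installs.
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

a = tf.constant(10)
b = tf.constant(20)

# This only prints a Tensor description -- nothing has executed yet.
# Repeated `a + b` calls create new ops, which TensorFlow numbers add_1, add_2, ...
print(a + b)

with tf.Session() as sess:
    result = sess.run(a + b)

print(result)  # 30
```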
That's another quick operation TensorFlow gives you, and it just creates a tensor with all elements set to zero. Let's give it the shape; we'll again ask for a 4 by 4. We'll do the same thing for ones with, as you may have guessed, tf.ones, and keep it 4 by 4. And I'll show you a few random distributions you can use. There's a random normal distribution; we'll call it my_randn. Keep in mind, everything on the left-hand side of that equals sign is just the variable name. Then we say tf.random, and as we begin to type random, you can see there's a ton of options here. We'll explore the options as we need them throughout the course, but random_normal is a more common one. It outputs random values from a normal distribution, and you can provide the mean and standard deviation as well as the shape. So let's go ahead and do that. We've been doing 4 by 4 for everything, so let's continue that trend, and we'll just keep the defaults. In case you want to specify, you could say mean is equal to 0, and for standard deviation, I forget what the default was; I think it was 1.0. You can obviously change that as you see fit.

A uniform distribution is also a very common distribution, so we'll say my_randu, with tf.random_uniform, and let's do the same thing here, 4 by 4. For a random uniform, instead of a mean and standard deviation, it wants a minimum value and a maximum value, and it draws uniformly from that range, from the min value to the max value. Or, you know, if you want a negative minimum value, that's OK too. So we'll say min value is 0 and max value is 1. OK, so we have a bunch of operations here. None of these have actually been executed yet, so if you just call for one of them, like my_zeros, you don't see any values; it just says, hey, this is a TensorFlow tensor.
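The NumPy-style operations above, collected in one hedged sketch. Note the 1.3-era names `tf.random_normal` and `tf.random_uniform` live under `compat.v1` on TensorFlow 2.x (where the current names are `tf.random.normal` and `tf.random.uniform`):

```python
# A bundle of NumPy-like tensor constructors, as covered in the lecture.
# compat.v1 shim assumed for TensorFlow 2.x installs.
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

my_const = tf.constant(10)
fill_mat = tf.fill((4, 4), 10)                             # 4x4 filled with 10
my_zeros = tf.zeros((4, 4))
my_ones  = tf.ones((4, 4))
my_randn = tf.random_normal((4, 4), mean=0.0, stddev=1.0)  # defaults made explicit
my_randu = tf.random_uniform((4, 4), minval=0, maxval=1)

print(my_zeros)  # just a Tensor description -- nothing executed yet

with tf.Session() as sess:
    print(sess.run(fill_mat))  # the actual 4x4 matrix of tens
```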
It's kind of just waiting for you to execute it or run it in a session. So I'm going to create a list here called my_ops, which is going to be full of these operations: my constant, my fill matrix (just using tab-complete to do this quickly), my_zeros, my_ones, my_randn, and then my_randu. OK, so now I have a list of all of these, so let's go ahead and run them inside of a session. Usually we're always going to be using with tf.Session(); that's pretty much how you'll always see it in the documentation. But I do want to introduce you to something called an interactive session. It's pretty useful for notebook settings like this Jupyter notebook; it doesn't really have much use outside of a notebook setting, depending on how you're actually coding TensorFlow. But basically, an interactive session allows you to keep calling it across multiple cells. Let me show you how to do that. We really won't be using it throughout the course, but in case you're interested, now's a good time to introduce it.

You just say sess is equal to tf.InteractiveSession(), and then basically the rest of these cells will behave as if they were already inside a with tf.Session() block. Again, this interactive session is really only useful in a notebook setting. So I'm going to say for op in my_ops, sess.run(op), and let's actually print this out so we can see the results. Run that, and here we can see all the results; let's add a new line in between each result. And here we have it: I can see the constant, the fill matrix (which, remember, was a 4 by 4 of tens), my zeros matrix, my ones matrix, and then my two random matrices. So again, the reason I was able to do this outside of an actual session block was because I had this interactive session.
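The interactive-session loop described above can be sketched as follows (compat.v1 shim assumed; on the course's TensorFlow 1.3 just `import tensorflow as tf`):

```python
# Running a list of ops through an InteractiveSession, as in the lecture.
# compat.v1 shim assumed for TensorFlow 2.x installs.
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

my_ops = [
    tf.constant(10),
    tf.fill((4, 4), 10),
    tf.zeros((4, 4)),
    tf.ones((4, 4)),
    tf.random_normal((4, 4)),
    tf.random_uniform((4, 4)),
]

# InteractiveSession installs itself as the default session, so later
# notebook cells can call run() or eval() without a `with` block
sess = tf.InteractiveSession()

for op in my_ops:
    print(sess.run(op))
    print('\n')  # blank line between results
```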
It's really useful in a Jupyter notebook environment, but to stick with the actual documentation and all the other examples you'll see online, we'll pretty much always be using with tf.Session(), unless it's a really quick job that I want to run between multiple cells. Something to note is that a lot of these operations have an eval method on them. So we may see that in the future, where instead of saying sess.run and passing in the operation, you take the op and call eval on it. That evaluation method is essentially telling it, hey, evaluate this operation, and you get the exact same results when you run that. So again, typically we'll be saying sess.run instead of calling eval, but for something quick and dirty in an interactive session, we may just use eval.

All right, continuing on, the last thing I want to talk about is matrix multiplication. We use matrix multiplication a lot with neural networks, especially our basic neural networks. So let's create a matrix real quick. We'll have it be a constant, and we're going to feed it in as a nested list: say 1, 2 here, comma, and then 3, 4. So that's actually a 2 by 2 matrix, with 1, 2 on the top row and 3, 4 on the bottom row, just a nested list. And if I say a, I can call get_shape off of this, and it says the shape of this tensor is 2 by 2, which makes sense; that's what we provided. Let's go ahead and create one more constant. This one is going to be a 2 by 1, so we'll have the first number be 10 and the second be 100. This is where you may have to refresh on your linear algebra after we do this multiplication, but essentially, if we get the shape, this one's a 2 by 1. So I'm going to say my result is equal to tf.matmul(a, b). Hopefully that looks a little familiar to you based off our basic neural network when we implemented it. So I have my result here, and then I can say sess.run(result), and it gives me back the actual array. It multiplied this 2 by 2 array by this 2 by 1, and as a result you get back a 2 by 1. Now keep in mind, usually you'd have to run this within a session; it's only because I called this interactive session that I'm able to run it between multiple cells. Pretty useful in a notebook, not super useful anywhere else. And one last reminder: I could have just called eval to see the results as well.

That's the very basics of TensorFlow syntax. I really hope that felt pretty familiar, especially after our manual neural network, and you can see the TensorFlow framework doing a lot of the heavy lifting behind the scenes for you. The main thing you should have gotten out of this lecture is that you can create basic constants and operations and then run them within a session. Thanks, everyone, and I'll see you at the next lecture.
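The matrix multiplication example above can be sketched as follows (compat.v1 shim assumed for TensorFlow 2.x; on 1.3 just `import tensorflow as tf`):

```python
# Matrix multiplication of a 2x2 constant by a 2x1 constant, as in the lecture.
# compat.v1 shim assumed for TensorFlow 2.x installs.
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

a = tf.constant([[1, 2],
                 [3, 4]])    # shape (2, 2)
b = tf.constant([[10],
                 [100]])     # shape (2, 1)

print(a.get_shape())         # (2, 2)

result = tf.matmul(a, b)     # (2, 2) x (2, 1) -> (2, 1)

with tf.Session() as sess:
    result_value = sess.run(result)

print(result_value)          # [[210] [430]]
```

Inside a session (or with an InteractiveSession open), `result.eval()` returns the same array as `sess.run(result)`.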