How does a computer work?

Evan Kimbrell
A free video tutorial from Evan Kimbrell
Founder of Sprintkick | Ex-VC | Ex-startup founder
4.4 instructor rating • 35 courses • 621,323 students

Learn more from the full course

Pre-Programming: Everything you need to know before you code

Increase your chance of success learning to code and communicating with other developers

06:16:07 of on-demand video • Updated July 2021

  • Better understand the fundamentals of how programming works
  • Understand the fundamentals of how computers work and how that relates to modern web technology
  • Choose what programming language and path you want to pursue in your career
  • Understand and apply the 8 basic concepts of programming
  • Evaluate, install, and modify any content management system
  • Understand world technology trends like responsive design, pair programming, PaaS systems, and the growth of APIs
  • Make a decision about what technology and ecosystem interests you
  • Correctly understand and apply the concept of a programming framework
  • Call out your friends for not knowing the difference between a framework, library, and IDE (they'll love you)
  • Communicate with others about technology in a way that doesn't immediately give away your inexperience
  • Impress your friends during drinks with random factoids about Bill Gates & Steve Jobs
  • Finally understand the reason Comcast keeps billing you $29.99
Hey guys, welcome back to the course. Ok, so at this point, what do we know? We know that computers are, in their essence, basically just machines that are designed to solve problems, but we often forget that without humans to operate them, they're actually just dumb boxes of metal. I know they call it a smartphone, or a smart watch, or even a smart fridge, but without someone actually driving them, giving them input information, they don't actually do anything but sit there and rust. They need your information and they need instructions in order to compute and to give you something of value - at least, of course, until computers get smart enough to learn how to compute themselves, in which case... God save us. I think that's actually the plot to Terminator. So that's something to keep in mind as you dive deeper into computer technology and into computer programming. Through whatever program or application you're using, you're literally giving a computer X information and telling it to do Y with that information. Your information trickles down to the core components of the computer and says, "Do this," and the computer says, "All right, I'll do it, lay off." It's essentially a thing waiting for you to give it commands, which brings us to the next question worth asking. How does a computer actually do what you just bossed it to do? What is going on inside of that stupid box on your desk? Well, the first thing we need to understand is that every time you ask your computer to process some instruction - let's say I'm typing "Evan Kimbrell" into Google, which is something I often do lying in bed at night, or you're double-clicking the Minesweeper icon to play that game with mines - you are communicating indirectly with what's called the "central processing unit" of your computer, also known as the CPU. The CPU is the core of the computer that executes instructions at the absolute lowest level. 
A computer processor is where the problems actually get solved. How does a CPU actually work? No one knows. Not even Bill Gates, who once said, and I quote, "The CPU is that thing that makes your computer do stuff." Well said, Bill. We're not really going to get deep into how a CPU works, because it's actually an immensely complicated process, and knowing that process is not really going to benefit you in any way in becoming a computer programmer or just better understanding programming. But if you insist, I'll give you a somewhat helpful explanation - just don't quote me on it later. CPUs, if you've ever opened up your computer, are those tiny little silicon squares, and they're usually attached to what's called the motherboard - that big green board with a bunch of wires and little nodes. These little squares are so complex that there are actually only two major companies in the world that make them for personal computers: the first one is Intel and the second one is AMD. Within this tiny little square of silicon, there are billions of what are called transistors, whose size is measured in "microns." Yes, I know your face just glazed over when I said micro-gadgets and transisto-whatevers, but a transistor is actually a lot easier to think of than it sounds. A transistor is really just a tiny switch that can turn an electric current on or off, that is it. And a micron is an extremely small unit of measurement. How small? Well, a piece of human hair is about 100 microns in diameter, and the world's smallest CPU transistor is about 6 microns. Actually, I'm joking, it was 6 microns back in 1974. Now they're about 0.05 of a micron. I know, it's insanity, and it's really hard to comprehend. Just know that they're extremely complex. 
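If switches doing math still sounds like magic, here's a minimal sketch in Python that models a transistor as a simple on/off switch and wires a couple of them together into logic gates. The function names are purely illustrative, not real hardware terminology:

```python
# Model a transistor as a switch: current flows from the source
# to the output only when the "gate" is on. Real transistors are
# analog silicon devices; this is just the on/off behavior CPUs
# rely on.

def transistor(gate: bool, source: bool) -> bool:
    """Output current only if the gate lets it through."""
    return source and gate

def and_gate(a: bool, b: bool) -> bool:
    # Two transistors in series: both gates must be on.
    return transistor(b, transistor(a, True))

def or_gate(a: bool, b: bool) -> bool:
    # Two transistors in parallel: either path can carry current.
    return transistor(a, True) or transistor(b, True)

print(and_gate(True, True))   # True
print(and_gate(True, False))  # False
print(or_gate(False, True))   # True
```

Chain billions of gates like these together and you get circuits that can add, compare, and store numbers.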
What CPUs essentially do is take electricity and commands, and then, through a very clever process of switching those billions of transistors on and off, actually process information. I know, this all sounds like gibberish, but think of it kind of like this. You have a calculator. A processor is like a calculator in its own way, just smaller. It works in repeating cycles of electricity, which we call the "clock" - clock speed is measured in hertz, or cycles per second, and a processor's output is measured in what we call MIPS, or millions of instructions per second. If you know anything about computers, have you ever heard the terms "overclocked" or "underclocked"? Those refer to speeding up or slowing down your processor's clock. Now, a processor can add, it can subtract, it can multiply, it can do really whatever needs to be done depending on what you send it. If a processor were on Sesame Street, it'd probably be the Cookie Monster, and this is the part where he gobbles up all the cookies and then screams out the alphabet or something. So that is roughly how calculations and information get processed. Now, of course, most times when you're working with a computer, you'll want to save whatever you're doing so it doesn't disappear forever. That is why computers come with another crucial component we call memory. Memory is essentially the way that computers keep information on file without it disappearing. Computers have permanent memory built right into them in the form of what's called a hard drive, which means you can save something, turn the computer off, and that information will actually stay there, and when you turn it back on, it will come back. Hard drives work very similarly to the way cassettes or CDs work. Again, do you need to know how that works at an even deeper level? No. 
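To make the clock idea concrete, here's a toy "processor" sketched in Python: on each tick of its clock, it fetches one instruction and executes it. The instruction names ADD, SUB, and MUL are made up for illustration; real CPUs decode binary opcodes, and they burn through billions of cycles per second:

```python
# A toy CPU: one register (the accumulator), one instruction
# executed per clock tick.

def run(program):
    acc = 0     # the accumulator register, starts at zero
    ticks = 0   # clock cycles consumed
    for op, value in program:   # fetch the next instruction...
        if op == "ADD":         # ...decode it...
            acc += value        # ...and execute it
        elif op == "SUB":
            acc -= value
        elif op == "MUL":
            acc *= value
        ticks += 1
    return acc, ticks

# Compute (0 + 5) * 3 - 2 in three clock ticks:
result, ticks = run([("ADD", 5), ("MUL", 3), ("SUB", 2)])
print(result, ticks)  # 13 3
```

Overclocking, in these terms, just means forcing more ticks of that loop to happen per second.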
Computer hard drives most often consist of a circular magnetic plate, hilariously called a platter, which spins like a CD and has information written onto it and read off of it by a read-write head. That head is mounted on what's called an actuator arm, which sweeps across the platter whenever it needs to read or change information. Yeah, I know, it's not super useful, but just think of it like an old Sony Walkman. You can store information on CDs, and when you put them in your Walkman they spin, and then the information gets read off of them. Hard drives are just like that, but full of a lot more information. Think of it as the new and improved Cookie Monster, who is able to control himself and, instead of immediately devouring all the cookies you give him, stows some away in the pantry in case you ask for them back. But what about information that you need to store but don't necessarily need to store forever? Well, the computer handles that too, through what's called RAM, which stands for Random Access Memory. RAM is used to quickly store and access information that doesn't have to be remembered forever, just for the duration of whatever program you're running. Let's say, for example, you open up your web browser and go to a website, as you often do. The images, the text, the animations, and whatever else is in the browser need to be displayed at this moment, and in order to do that, they have to be stored somewhere temporarily. Where do they get stored? In your RAM. As soon as you exit out of that program, your computer says, "We don't need these images and this text anymore," clears them out of your RAM, and frees it up so you can use other applications. So have you ever had your computer get really, really slow when you have too many tabs open in Google Chrome? 
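The RAM-versus-hard-drive split is easy to feel in code. In this Python sketch, a dictionary plays the role of RAM (gone the moment the program exits) and a file plays the role of the hard drive (still there after a reboot); the filename and data are just example values:

```python
import json
import os

# "RAM": a plain dict lives in memory and vanishes when the
# program (or the computer) shuts down.
page_cache = {"headline": "Hello, world", "visits": 3}

# "Hard drive": anything written to a file survives a restart.
with open("saved_page.json", "w") as f:
    json.dump(page_cache, f)

# Later - even after a reboot - the file can be read back:
with open("saved_page.json") as f:
    restored = json.load(f)
print(restored["headline"])  # Hello, world

os.remove("saved_page.json")  # tidy up the example file
```

That's the whole trade-off in miniature: the dict is fast but temporary, the file is slower but permanent.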
Well, that's most likely because you're running out of RAM, and the system has to wait for RAM to free up before processing or loading the next thing. That's why your computer freezes and then unfreezes a moment later. So when someone says, "Oh wow, Firefox is such a light browser," whether it's true or not, what they mean is that it doesn't have a lot of unnecessary processes, and because of that, when you run it, it takes up less RAM. What that means at the end of the day is that you can have more browser windows open, and if you have commitment issues, you can decide to just keep and hoard all of your tabs, like I do. As a programmer, your basic function is that you're going to be giving the computer instructions, and you'll know how to do it in a way that the computer can understand. Back in the day, programmers actually sent instructions to a computer the "hands-on" way. They would literally shove giant stacks of cards into machines the size of rooms, and the patterns of the holes punched into each card would tell the machine what to do. Punch cards were actually the original programs. Later on, when the British mathematician Alan Turing (you probably know him as Benedict Cumberbatch) made his breakthrough discovery of the "stored program" concept, programmers could suddenly design machines that remembered programs. No more need to shove punch cards into them all the time. Now, 80 years later, humans still can't remember anything, and computers can remember everything, while being small enough to fit in your pocket. So, there you go. You put a CPU (a central processing unit) and two different types of memory together, and you have the building blocks for a good problem-solving machine - a machine that needs a human to give it instructions, or else it's just an expensive paperweight. In the next lecture, we're going to talk about how computers get the information they need, and how they shoot it out, so we can use it. 
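The stored-program idea described above can be sketched in a few lines of Python: the program lives in the same memory as the data it works on, so loading a different program just means changing memory contents, not rewiring the machine. The instruction names are, again, purely illustrative:

```python
# Memory holds instructions AND data side by side - the essence
# of the stored-program concept. Load a different list, get a
# different machine, no re-punching cards required.

memory = [
    ("ADD", 2),    # slots 0-2: the program
    ("MUL", 10),
    ("SUB", 1),
    7,             # slot 3: the data the program operates on
]

acc = memory[3]                # load the data value
for op, value in memory[:3]:   # instructions are just memory contents
    if op == "ADD":
        acc += value
    elif op == "MUL":
        acc *= value
    elif op == "SUB":
        acc -= value
print(acc)  # 89
```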
All right, see you in the next lecture!