- 6.5 hours on-demand video
- 3 articles
- 16 downloadable resources
- Full lifetime access
- Access on mobile and TV
- Certificate of Completion
- Use the Command Line Interface to deploy AWS resources
- Set up multiple profiles with multiple access keys to quickly switch between permissions or accounts
- Automatically sync local files to S3 at specified times
- Create and invoke Lambda Functions using the command line
- Build a VPC using the command line
- Deploy Cloudformation templates to create Stacks with AWS CLI
- Deploy EC2 instances and create a custom dashboard to view running instances on the command line.
- Create IAM users, access keys, and roles using the command line
- AWS Account
- Experience with AWS
----- Recent Updates -----
- AWS S3 server-side encryption lessons added. These cover SSE-S3, SSE-KMS, and SSE-C (not available via the AWS console)
- Creating AWS KMS keys with the CLI
- S3 Multipart upload with the AWS CLI
- Use the CLI to work with Amazon Rekognition (for image recognition and video analysis)
About the Course:
This course is designed to help students and developers get started with the AWS Command Line Interface (CLI). If your prior experience with AWS has been solely through the web console, the CLI is a different way of using your AWS account. Using the command line interface is a critical skill for any AWS professional.
The AWS Command Line Interface (CLI) is a unified tool to manage your AWS services. With just one tool to download and configure, you can control multiple AWS services from the command line and automate your infrastructure through scripts.
You should be ready to manage and automate your AWS infrastructure using the CLI after this course. You will learn one more way to deploy, manage, and destroy infrastructure and services on AWS, which will make your workflow much faster and more efficient.
Learning the command line will give you another perspective on AWS services and may even shed light on concepts you are unclear about. The AWS Developer Associate exam objectives mention the "Ability to use the AWS service APIs, AWS Command Line Interface (CLI), and software development kits (SDKs) to write applications." The command line is also a big part of the AWS DevOps Pro exam.
In this course, we will cover topics like:
- Create access keys to use with the AWS CLI
- Install and set up the CLI on your local machine
- Create a VPC with the CLI
- Create EC2 instances, view running instances, and filter attributes
- Copy files to and from S3 buckets; sync local folders with automated cron jobs
- Create Lambda functions and invoke them using the CLI
- Deploy CloudFormation stacks with the CLI
After this course you can begin making calls to your AWS services from the command line like:
$ aws ec2 describe-instances
$ aws s3 ls
$ aws s3 sync . s3://mybucketname
$ aws ec2 stop-instances --instance-ids i-123abcdefg
and more advanced things like creating Lambda functions, creating CloudFormation stacks, and so on.
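For instance, a hedged sketch of the Lambda workflow (the function name, runtime, role ARN, and zip file are illustrative placeholders, not values from the course):

```shell
# Create a Lambda function from a local deployment package.
# The role ARN and function.zip are assumptions for illustration.
aws lambda create-function \
  --function-name hello-cli \
  --runtime python3.9 \
  --role arn:aws:iam::123456789012:role/my-lambda-role \
  --handler lambda_function.lambda_handler \
  --zip-file fileb://function.zip

# Invoke it and write the response payload to out.json
aws lambda invoke --function-name hello-cli out.json
cat out.json
```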
What you will get with this course:
A catalog of videos/labs on how to use the AWS CLI
Future updates on various new topics
Ability to ask questions on the Discussion board
If you have any request for a certain topic, please share them in the discussion section.
- Those who want to use the command line to deploy infrastructure on AWS.
- Those preparing for AWS certification exams
- Those who want to automate the deployment and management of AWS resources.
Working through the AWS console might not always be the best option. You might want to bypass the many clicks needed to launch an instance or upload a file to S3. Thankfully, AWS has a really intuitive CLI for most, if not all, services for exactly these kinds of tasks.
In this course, we will go through the steps to work with various AWS services like S3, EC2, VPC, Lambda, IAM, CloudFormation, etc., using the AWS CLI.
AWS provides various options to encrypt your data on S3.
There are two types of encryption:
- Client-side: the client encrypts locally using a tool of their choice
- Server-side encryption: choose from the options AWS provides
In this video, our focus will be server-side encryption (SSE), since client-side encryption depends entirely on your own tools and requirements.
Server-side encryption itself comes in three types:
- SSE-S3: one-click encryption with S3-managed keys
- SSE-KMS: encryption using AWS KMS keys
- SSE-C: not available in the console; the customer provides the keys
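As a hedged sketch, the three server-side options map to CLI flags roughly like this (bucket, file, key ID, and key file names are placeholders):

```shell
# SSE-S3: S3-managed keys
aws s3 cp report.csv s3://mybucketname/ --sse AES256

# SSE-KMS: a KMS key you own (the key ID is a placeholder)
aws s3 cp report.csv s3://mybucketname/ --sse aws:kms \
  --sse-kms-key-id 1234abcd-12ab-34cd-56ef-1234567890ab

# SSE-C: you supply the key yourself; not available in the console
aws s3api put-object --bucket mybucketname --key report.csv --body report.csv \
  --sse-customer-algorithm AES256 --sse-customer-key fileb://sse-c.key
# (the same --sse-customer-* flags must be supplied again on get-object)
```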
We will need two accounts for this lesson. Account A will give Account B access to one of A's buckets.
On Account A — Create a new Bucket:
Add some items into the bucket.
Enter this policy text in the bucket policy of Account A, changing the account ID and bucket name:
“Sid”: “Example permissions”,
** If you remove the second line from the Resource section, you can only list the bucket, not copy the files inside it. With the second line, you get access to every object inside the bucket.
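The policy snippet above is abbreviated; a sketch of the full cross-account bucket policy it describes might look like this (the account ID and bucket name are placeholders you must replace, and the action list is an assumption for illustration):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Example permissions",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::ACCOUNT_B_ID:root" },
      "Action": ["s3:ListBucket", "s3:GetObject"],
      "Resource": [
        "arn:aws:s3:::BUCKET_NAME",
        "arn:aws:s3:::BUCKET_NAME/*"
      ]
    }
  ]
}
```

The first Resource line grants the bucket-level ListBucket permission; the second (`/*`) line is what grants access to the objects inside.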
Using Account B's access credentials, run this command with the AWS CLI:
$ aws s3 ls bucketname
You should be able to list the bucket and copy content from another account.
Now try adding this to the bucket policy for finer-grained control:
“Sid”: “Example permissions”,
“Sid”: “Deny permission”,
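A sketch of what those two statements could look like together (the denied prefix and the exact actions are assumptions for illustration; account ID and bucket name are placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Example permissions",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::ACCOUNT_B_ID:root" },
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::BUCKET_NAME"
    },
    {
      "Sid": "Deny permission",
      "Effect": "Deny",
      "Principal": { "AWS": "arn:aws:iam::ACCOUNT_B_ID:root" },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::BUCKET_NAME/Confidential/*"
    }
  ]
}
```

An explicit Deny always wins over an Allow, so Account B can still list the bucket but cannot fetch objects under the denied prefix.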
CLI Reference: https://docs.aws.amazon.com/cli/latest/reference/s3/presign.html
GitHub repo link: https://github.com/ravsau/AWS-CLI-Commands
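The presign command from the CLI reference above can be used, for example, like this (bucket and key are placeholders):

```shell
# Generate a temporary URL for a private object, valid for one hour (3600 s)
aws s3 presign s3://mybucketname/report.csv --expires-in 3600
```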
In this lesson, our aim is to create an Amazon Machine Image, commonly referred to as an AMI, that we can use to launch instances in the future.
So far, we’ve been mostly using the Amazon Linux AMI. Now we can create our own AMIs.
An AMI essentially saves the configuration of a server at the time the image is created. So if I create an image of my web server, I can launch other instances from it and have a web server running immediately after they start.
This means that we don’t have to install a web server every time we provision a new web server. In today’s example, we will only install a web server, but you can create an AMI with any software installed, and use that to launch an EC2 instance.
Step 1: We will use user data to launch an EC2 instance. We will be using a file with a bootstrap script containing the commands to install and start a web server, and we will pass that file to the run-instances command.
Launch an EC2 Instance and save the instance ID into an environment variable
instance_id=$(aws ec2 run-instances --image-id <your-ami-id> --instance-type t2.micro --key-name MyKeyPair1 --user-data file://userdata.txt --query 'Instances[*].[InstanceId]' --output text)
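The course supplies its own userdata.txt; as a minimal sketch, a bootstrap script like this could produce the "Hello World" page on Amazon Linux (the package manager and service commands are assumptions about the base image):

```shell
#!/bin/bash
# Hypothetical userdata.txt: install and start Apache, then write a test page.
yum update -y
yum install -y httpd
systemctl start httpd    # on older Amazon Linux 1: service httpd start
systemctl enable httpd   # start on every boot, which is what the AMI preserves
echo "Hello World" > /var/www/html/index.html
```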
Step 2: Check that the user data worked and the web server is running by entering the web server’s IP in a browser and verifying you see “Hello World”.
Step 3: Create an image from that instance ID and save the image id to a variable image_id
image_id=$(aws ec2 create-image --instance-id $instance_id --name "My server" --description "An AMI for my webserver" --query ImageId --output text)
Step 4: Use that image to launch a new instance
aws ec2 run-instances --image-id $image_id --instance-type t2.micro --key-name MyKeyPair1 --query 'Instances[*].[InstanceId]' --output text
Step 5: Verify the web server is running by entering the new instance’s IP address in a browser.
Thanks, and see you in the next lesson.
In this lesson, I will briefly explain how CloudFormation works.
AWS CloudFormation gives developers and systems administrators an easy way to create and manage a collection of related AWS resources, provisioning and updating them in an orderly and predictable fashion.
You can create your own templates to describe the AWS resources, and any associated dependencies or runtime parameters, required to run your application.
A stack is created from a template, and we can update or delete the entire stack at once.
CloudFormation is available at no additional charge, and you pay only for the AWS resources needed to run your applications.
CloudFormation is a very useful tool to have.
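As a hedged sketch of the stack lifecycle described above (the stack name and template file are placeholders):

```shell
# Create a stack from a local template file
aws cloudformation create-stack --stack-name my-stack \
  --template-body file://template.yaml

# Block until creation finishes, then check the status
aws cloudformation wait stack-create-complete --stack-name my-stack
aws cloudformation describe-stacks --stack-name my-stack \
  --query 'Stacks[0].StackStatus' --output text

# Delete the whole stack, and the resources it created, at once
aws cloudformation delete-stack --stack-name my-stack
```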
You can find all the commands and files in the link below:
By default, we cannot monitor memory metrics on EC2 instances, but by using a custom metric we can. In this video, I walk through the process of setting up custom metrics. This is also one of the possible questions in the AWS Associate Developer/Architect/SysOps exams.
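As a hedged sketch of the idea (the namespace, dimension, and the way memory usage is computed are assumptions; the instance needs an IAM role allowing cloudwatch:PutMetricData):

```shell
# Compute used-memory percentage from `free`, then publish it as a
# custom CloudWatch metric. Run this periodically, e.g. from cron.
mem_used_pct=$(free | awk '/Mem:/ {printf "%.1f", $3/$2*100}')

aws cloudwatch put-metric-data \
  --namespace "Custom/EC2" \
  --metric-name MemoryUtilization \
  --unit Percent \
  --value "$mem_used_pct" \
  --dimensions InstanceId=i-123abcdefg
```

Once published, the metric appears under the chosen namespace in the CloudWatch console like any built-in metric.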