Automated Continuous Integration (CI) and Continuous Deployment (CD) are a critical part of DevOps, and expertise in them is in high demand.
In this course, we will examine the subject in complete depth by walking through an example project and building a complex CI/CD pipeline on AWS. You will gain the following five extremely valuable and highly sought-after skills:
The tech landscape today is extremely competitive and moving at an incredibly fast pace. With the emergence of cloud-based infrastructure, startups are disrupting long-established businesses, industries and sectors almost daily as barriers to entry and costs continue to fall. DevOps and continuous integration / continuous deployment processes allow teams to iterate often and innovate faster.
After taking this course, you will have a comprehensive understanding of continuous integration and continuous deployment processes. You will be able to confidently design a CI/CD pipeline for your own web applications. You will gain an in-depth understanding of AWS CodePipeline and AWS Elastic Beanstalk services.
With these skills, you will be able to build fully automated deployments of your web applications on Amazon’s cloud infrastructure.
The course is very hands-on and together we will walk through an example project. We will pick a web application and deploy it on Amazon’s Cloud using AWS Elastic Beanstalk. I will then demonstrate how to create a fully automated CI/CD pipeline for our web application using AWS CodePipeline. I will cover both of these AWS services in complete depth while also giving you easy-to-follow, step-by-step instructions.
We will also cover some advanced topics such as ebextensions and adding AWS Lambda functions to your AWS CodePipeline.
★ 4000+ students enrolled
★ Rated highly by students
★ 70% of the course is hands-on and practical
In this lecture, I will give you a short background on DevOps, continuous integration and continuous deployment. I will also introduce myself and explain why and how CI/CD and DevOps can help you in your career.
In this lecture, I will give a high level overview of what we are going to cover in this section. We will cover some basic concepts around Continuous integration, continuous delivery and continuous deployment.
Continuous Integration means regularly and frequently merging code changes and building your software.
Continuous Delivery / Deployment means regularly releasing your software to a destination.
A continuous integration / continuous deployment pipeline is the sequence of steps a code change goes through.
What are the benefits of using Continuous Integration / Continuous Deployment processes? I will share some of my own personal experiences where I observed the benefits of using CI/CD.
What is a key challenge associated with fully automated CD pipelines?
Overview of the topics that we have covered so far.
In this lecture, I will give a high level overview of what we are going to cover in this section. We will learn to conceptualize and design a CI/CD pipeline for web applications.
Let's take a look at the anatomy of a basic web application (e.g. the architecture and components involved). We need to understand the architecture of our web applications in order to design an effective CI/CD process. I have formulated a four-step process to help you conceptualize and design a CI/CD pipeline -- we will cover three of those steps in this lecture.
We will talk about the fourth step, i.e. how to model a pipeline. We will walk through our example and design a basic CD pipeline using the four steps.
In this lecture, we will talk about how you can manage relational database changes as part of your continuous deployment pipeline.
We will talk about different types of actions that can be performed as part of your CI/CD pipeline (stand-alone vs. deployment actions).
Monitoring is important for automated CI/CD pipelines -- what are the different approaches you can use to monitor your application as it progresses through your CD pipeline?
What are some of the common challenges associated with fully automated CD pipelines? We will talk about the two most common challenges you will encounter with web applications.
In this section we will dive deeper into our example web application (built in PHP). Our web application is called 'Fuzzy Telegram' and displays a random quote of the day.
The Fuzzy Telegram web application talks to the Quotes API (https://theysaidso.com/api/) and fetches the random quote of the day. It stores the quote in a database so that likes and view counts can be recorded against it.
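As a rough illustration of that flow, here is a hedged Python sketch. The payload shape and field names below are assumptions made for illustration only; the real response format is defined by the Quotes API documentation, and persistence in the real app happens in a database rather than in memory.

```python
import json

# Hypothetical sketch of the app's flow: parse a fetched quote, then record
# a view against it. The JSON shape below is an ASSUMPTION for illustration;
# consult the actual API documentation for the real response format.

def extract_quote(payload: str) -> dict:
    """Pull quote text and author out of an (assumed) JSON payload."""
    data = json.loads(payload)
    quote = data["quote"]
    return {"text": quote["text"], "author": quote["author"], "views": 0, "likes": 0}

def record_view(quote: dict) -> dict:
    """Increment the view counter (the real app persists this in a database)."""
    quote["views"] += 1
    return quote
```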
Most modern web applications use environment variables as a way to store configuration options (e.g. database credentials). We will use environment variables in our web application.
Another reason we are keen to use environment variables is that Elastic Beanstalk supports them natively, so it is quite handy to rely on them for configuration options.
We will configure our database credentials using environment variables.
We will learn how and why to write database migration scripts. This enables us to automate database deployments in our CD pipelines.
We will use Phinx (a PHP database migration tool) in our example, but the concept is broadly applicable to any database migration script or tool that you choose going forward.
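The tool-agnostic idea behind Phinx and similar tools is simple bookkeeping: each migration carries an ordered version, and the tool applies only the versions not yet recorded. A minimal Python sketch of that bookkeeping (not Phinx itself, which is PHP):

```python
# Tool-agnostic sketch of migration bookkeeping: migrations are applied in
# version order, and applied versions are recorded so reruns are no-ops.
# Real tools (Phinx, Flyway, Alembic, ...) persist this in a schema table.

def pending_migrations(all_migrations, applied_versions):
    """Return migrations not yet applied, in ascending version order."""
    return sorted(
        (m for m in all_migrations if m["version"] not in applied_versions),
        key=lambda m: m["version"],
    )

def migrate(all_migrations, applied_versions, execute):
    """Apply each pending migration's SQL and record its version."""
    for m in pending_migrations(all_migrations, applied_versions):
        execute(m["sql"])
        applied_versions.add(m["version"])
    return applied_versions
```

Because already-applied versions are skipped, running `migrate` on every deployment is safe -- which is exactly what makes database changes automatable in a CD pipeline.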
We will quickly take a look at some example unit tests in our web application.
Last step for us is to check everything into a code repository.
In this lecture, I will give a high level overview of what we are going to cover in this section. We will now dive deeper into AWS Elastic Beanstalk and deploy our web application on EB. We will also cover advanced topics such as ebextensions and application versions.
Let's start configuring our AWS Elastic Beanstalk environment to deploy the web application. I will walk you through the advanced configuration options available in Elastic Beanstalk.
Continuing our discussion of the advanced configuration options available in Elastic Beanstalk.
A quick tour of the Elastic Beanstalk dashboard that is available after you have set up your application.
Let's see how you can manually deploy your web application on AWS Elastic Beanstalk.
Manually running database migration scripts on our EC2 instance (in our example we will run phinx migration scripts).
Introducing AWS Elastic Beanstalk Extensions (.ebextensions), a great way of customizing your EC2 / AWS environment. We will focus on how you can run custom scripts using ebextensions, but their capabilities are broader (we won't cover all the features).
Writing our first .ebextension script. The script will run our database migration script (phinx) as part of the deployment process to Elastic Beanstalk.
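For reference, an .ebextensions config that runs migrations during deployment typically looks like the sketch below. The filename, path to the Phinx binary, and environment name are assumptions and will differ per project. `container_commands` run after the application bundle is extracted but before the new version is served, and `leader_only` restricts the command to a single instance so migrations are not run in parallel.

```
# .ebextensions/01-migrate.config  (filename is illustrative)
container_commands:
  01_run_migrations:
    # Path to the Phinx binary and the environment name are assumptions.
    command: "php vendor/bin/phinx migrate -e production"
    leader_only: true
```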
Let's put everything together and create a new environment in our Elastic Beanstalk application. In this workshop we will create a new environment that will serve as the 'production' stage for our web application.
A quick overview of application versions in Elastic Beanstalk and why they are useful.
In this lecture, I will give a high level overview of what we are going to cover in this section. We will dive deeper into AWS CodePipeline and set up our first CD pipeline using it.
Let's quickly check everything into our code repository (GitHub). We will use this code repository in our AWS CodePipeline later.
Let's create our first AWS CodePipeline. I will walk you through some of the configuration options available when creating a new pipeline in AWS CodePipeline.
We will cover AWS CodePipeline concepts such as stages, actions and transitions.
Let's take a look at how you can manually release changes in AWS CodePipeline.
We will talk about stage transitions in AWS CodePipeline and how you can disable/enable them.
We will discuss different ways you can execute your actions in AWS CodePipeline.
We will talk about the actions offered by AWS CodePipeline.
AWS CodePipeline has a notion of artifacts. In this lecture we will cover what those are and how to use them in your AWS CodePipeline.
In order for our CI/CD pipeline to be effective, we will most likely wish to run some custom script or custom logic. We will talk about the four different ways you can execute custom builds / custom scripts as part of your AWS CodePipeline.
The first method to run custom scripts in AWS CodePipeline is by executing deployment actions (e.g. ebextensions). In this lecture we will set up a new environment in Elastic Beanstalk that we will use to run our tests.
Continuing on from our previous lecture, we will write a new ebextension script to execute unit tests. We will configure this as part of our pipeline and see how our pipeline reacts to success/failures.
The second method to run custom scripts in AWS CodePipeline is by invoking AWS Lambda functions.
We will talk about Lambda functions and start our three-part workshop to create a new Lambda function that will upload our assets (CSS, JS and images) to S3. We will then use the S3 bucket to serve static content for our web application.
Part 1 of 3: In this lecture, we will configure our S3 bucket to serve static content publicly.
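For reference, making a bucket's objects publicly readable is typically done with a bucket policy like the one below; the bucket name is a placeholder you would replace with your own.

```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadForStaticAssets",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::YOUR-BUCKET-NAME/*"
    }
  ]
}
```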
Part 2 of 3: In this lecture, we will setup our AWS Lambda function to upload static assets to an S3 bucket. We will also briefly talk about IAM permissions in AWS.
Part 3 of 3: In this lecture, we will take our newly created AWS Lambda function and configure it in our AWS CodePipeline.
The last two methods of running custom scripts in AWS CodePipeline are by integrating 3rd-party services, either as Build or as Test actions. In this lecture, I will cover how to integrate a 3rd-party 'Test' action, but the same principle applies to the 'Build' action as well.
I will integrate Ghost Inspector (a UI testing tool) into our AWS CodePipeline.
I am a Software Engineer by profession with over 12 years of industry experience building complex multi-tier web applications. I have worked in big tech firms where my day job was to build highly scalable web applications and services, designed to serve millions of requests per second. I am also an AWS Certified Solutions Architect.
I am joining Udemy to share my expertise as well as a few tricks of the trade that I have picked up over the years. My educational background is in Computer Science & Engineering from one of the top universities in the United Kingdom (Russell Group). So, in every course, I will share insights from my experience while balancing them with academic fundamentals.
I am a strong believer in an interactive mode of teaching and learning. In my courses I reinforce learning through demos, code-along sessions, and simple yet effective practical assignments.