- 5 hours on-demand video
- 9 articles
- 29 downloadable resources
- Full lifetime access
- Access on mobile and TV
- Certificate of Completion
- What is a pipeline
- What Continuous Integration (CI), Continuous Delivery and Continuous Deployment (both abbreviated as CD) are
- Automate your build, test & deployment with GitLab CI
- Learn industry "best practices" in building CI/CD pipelines
- Demonstrate your understanding of building CI/CD pipelines to future employers
- Automate your builds, tests, and deployments
- Automatic deployments using AWS
- Build pipelines with code quality checks, unit tests, API testing
- Solve problems with hands-on assignments
- Create Merge Requests and review code
- Dynamic environments
This lecture aims to give you an understanding of what pipelines are and how they can be built in GitLab CI, following a very simple example. The example will use two simple stages and define two jobs, one assigned to each stage.
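As a sketch of what such a simple pipeline could look like in a `.gitlab-ci.yml` file (the stage and job names here are illustrative placeholders, not taken from the lecture):

```yaml
# Two stages, executed in order: first build, then test
stages:
  - build
  - test

# One job assigned to each stage; job names are placeholders
build the project:
  stage: build
  script:
    - echo "Building the project"

test the project:
  stage: test
  script:
    - echo "Testing the project"
```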
- YAML is just a data serialization language that allows us to store different kinds of data. In our case, we use YAML to define a pipeline, but YAML is used for many other things (for example, configuration files for tools such as Docker Compose and Kubernetes)
- to put it very simply, YAML can be used to define key-value pairs, for example `name: John`. But YAML can also store lists or objects
- this is exactly what we have done so far, but without going too much into the details of what everything is
- while it may not look like it, YAML is actually compatible with another format called JSON
- name: John
- comments start with a hash sign: # foo
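The YAML features listed above can be illustrated in one small snippet (the values are made up):

```yaml
# Comments start with a hash sign
name: John            # a key-value pair

hobbies:              # a list
  - reading
  - cycling

address:              # an object (nested key-value pairs)
  city: Berlin
  country: Germany
```

The same data could equally be written as JSON, which is why the two formats are considered compatible.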
In this section, we increase the complexity of the application we are building and deploying. We will be building and testing a Java application, and we will deploy it to Amazon Web Services (AWS). I understand if you are not a Java developer or are not interested in using Java or AWS. This section tries to provide you with a realistic example, and I can assure you it is totally fine just to watch and understand the underlying concepts, which apply to any other technologies and services.
The pipeline we will be building is more complex, but the same GitLab CI principles will be used.
In this process, you will learn:
- how to build an advanced CI pipeline with code-style checks, unit tests, API tests, performance tests & security checks
- how to publish test results (in both HTML and XML format)
- about cloud services and how to use AWS for deployments
This means you will be exposed to new tools and technologies that you may not be familiar with. Overall the complexity of this project is much higher than before. As with many new things, a bit of patience is required when things do not work as expected.
As this course is focused on building pipelines with GitLab CI, I cannot give you a full introduction to all the tools and technologies used. I will provide you with links to articles and other video tutorials.
If you want to follow along, that is fantastic. As always, I am here in case you need help or get stuck.
Please note that some lectures may have a Troubleshooting document in the resources folder, which can help you fix some common issues.
Are you just as excited to get started? Let's go!
This is a simple Java application that represents a simple car fleet management solution. The tool that you see here is an IDE called IntelliJ. I understand that if you are not familiar with IntelliJ, getting this to run on your computer may be challenging. If you can't get it to work, don't worry. This is nice to have but not needed to build the pipeline. Also, check the Resources for some tutorials that can help you get started.
I have already done the programming work, but together we will be building the CI/CD pipeline.
Feel free to clone this repository so that we all have a common basis to get started.
If you don't want to install and run the application locally, no problem. You can work on the .gitlab-ci.yml file without any issues.
This application exposes an API that allows you to add, view, and remove cars from a database.
An API is a program that does not have a graphical interface the way, for example, a website does. However, the API can be used by a front-end application to display the data in a browser.
We are ready to start building the pipeline for this project and the first step is the CI pipeline.
If you remember, a CI pipeline typically has a few stages: build, code quality, test, and packaging the application for later use.
The purpose of the CI pipeline is to ensure that the artifact that we are building corresponds to our quality criteria and is releasable.
Let's start building the CI pipeline with the build stage. Even if you are not a Java developer and have never worked with Java, don't worry: most programs go through a similar build stage.
The build process will take the source code and transform it into something that can be executed on a computer. We call this process compilation. In this case, the build process will translate source code into Java bytecode that can be executed on the Java Virtual Machine (JVM). The output is a jar file (which is an archive) that contains this code.
To run the build process locally, I will use Gradle, a popular build tool for Java projects.
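A build job for this pipeline could look roughly like the following; the Docker image tag and the artifact path are assumptions based on Gradle's defaults, not taken from the project:

```yaml
build:
  stage: build
  image: gradle:jdk17            # public Gradle image from Docker Hub (tag is an assumption)
  script:
    - gradle build               # compiles the code and produces the jar
  artifacts:
    paths:
      - build/libs/*.jar         # Gradle's default output location for the jar
```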
Now we have an artifact (a software package) and we are ready for deployment. There are two opposite directions in which we can go: deploy on our own infrastructure (a server that we control and manage ourselves, also called an in-house server) or deploy using a cloud provider, such as Amazon AWS, Google Cloud, Microsoft Azure, and many others.
The advantage of using a cloud provider is that you only rent the infrastructure for the time you are using it. Using a cloud provider, you can focus on actually building and maintaining the application and forget about the hardware and scalability issues.
Which option makes more sense is up to you. For some of the reasons mentioned above, cloud services have greatly risen in popularity in recent years.
The following lectures will show how to use Amazon AWS to deploy a Java application.
Amazon Web Services or simply AWS is a cloud platform offering over 170 services available in data centers located all over the world. Such services include virtual servers, managed databases, file storage, content delivery, and many others.
While this section focused on AWS, the principles presented here largely apply to any other providers.
This lecture discusses:
- how to create a new account
- how to set up billing for AWS
This lecture contains a short introduction to the serverless architecture and AWS Elastic Beanstalk. Even if you use a cloud provider like AWS, you can still rent a virtual machine that has a dedicated CPU, memory, and disk.
If you use a virtual server, this means that you still need to handle software updates, back-ups, monitoring, and any other aspects that ensure your application is running as expected.
AWS Elastic Beanstalk is a way to deploy an application but let AWS handle the hardware and software needed to run it. It is probably one of the easiest ways to deploy an application in the cloud.
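A deployment job driven by the AWS CLI might look roughly like this. The bucket, application, and environment names are placeholders, and the job assumes AWS credentials are configured as CI/CD variables:

```yaml
deploy:
  stage: deploy
  image:
    name: amazon/aws-cli         # official image that ships the aws command
    entrypoint: [""]             # reset the entrypoint so GitLab can run the script
  script:
    # Upload the jar to S3 (bucket name is a placeholder)
    - aws s3 cp build/libs/app.jar s3://my-deploy-bucket/app-$CI_PIPELINE_IID.jar
    # Register a new application version with Elastic Beanstalk
    - aws elasticbeanstalk create-application-version --application-name my-app --version-label $CI_PIPELINE_IID --source-bundle S3Bucket=my-deploy-bucket,S3Key=app-$CI_PIPELINE_IID.jar
    # Point the environment at the new version
    - aws elasticbeanstalk update-environment --environment-name my-app-env --version-label $CI_PIPELINE_IID
```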
We need to ensure that the right application version was deployed. The way to approach this is to look at the info endpoint, which will tell us the current application version. Having this check is essential to avoid any confusion regarding what was deployed.
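Such a check could be automated as a small job running after the deployment. The URL and the response format here are hypothetical; adapt them to whatever your info endpoint actually returns:

```yaml
verify deployment:
  stage: deploy                    # or a dedicated stage declared in `stages`
  image: alpine:latest
  script:
    - apk add --no-cache curl
    # Fail the job if the deployed version does not match the pipeline's version
    - curl -s http://my-app-env.example.com/info | grep "$CI_PIPELINE_IID"
```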
Most projects want to have a consistent code style and to follow some conventions and best practices. Often automated tools are used to assist with this process.
These tools typically perform static code analysis, that is, the inspection is done without actually running the code. This approach is in contrast with dynamic code analysis, which actually runs the code in order to perform the inspection.
A simple tool that can be used for Java projects is PMD. PMD can help find unused variables and problematic code blocks, and overall helps enforce generally accepted best practices.
PMD already comes with a large set of predefined rules but also allows you to configure existing rules or add new ones, as needed.
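Assuming the project applies Gradle's PMD plugin (an assumption about the project setup), a code quality job could be sketched like this:

```yaml
code quality:
  stage: code quality
  image: gradle:jdk17
  script:
    - gradle pmdMain               # task added by Gradle's PMD plugin
  artifacts:
    when: always                   # keep the report even when the job fails
    paths:
      - build/reports/pmd/         # Gradle's default location for PMD reports
```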
In this lecture, we quickly go over what a unit test is. In a nutshell, unit tests are responsible for testing only single units of code, typically one class. They execute very fast and give instant feedback.
For Java projects, JUnit is the most popular framework for writing unit tests.
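A unit test job can hand the JUnit XML report to GitLab so that test results show up directly in merge requests. The paths follow Gradle's defaults and are assumptions:

```yaml
unit tests:
  stage: test
  image: gradle:jdk17
  script:
    - gradle test
  artifacts:
    when: always
    reports:
      junit: build/test-results/test/*.xml   # GitLab parses this into a test summary
```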
GitLab Pages is a great addition to any CI/CD pipeline. GitLab Pages allows you to publish HTML websites directly from a repository. With some HTML and CSS skills, this is a great option for creating dashboards. It also allows you to publish HTML reports.
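Publishing, for example, the HTML test report works by adding a job that must be named `pages` and must expose a `public` directory as an artifact (the report path is an assumption based on Gradle's defaults):

```yaml
pages:
  stage: deploy
  script:
    - mkdir public
    - cp -r build/reports/tests/test/* public/   # copy the HTML report into the published folder
  artifacts:
    paths:
      - public                                   # GitLab Pages serves exactly this directory
```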
- Basic experience with Linux, Linux commands and using the terminal
- Know how to work with Git (basics like configuring a repository locally, cloning, merge, commit, push)
- Admin permissions that allow you to install additional tools (Node, npm, Docker, Virtualbox)
- Optional: some basic experience with Docker will be a bonus
This course will teach you how to use GitLab CI for your own projects. You will learn the basics of CI/CD and start building pipelines right from the first lecture.
- get an overview of the GitLab architecture
- create a simple pipeline
- learn the CI/CD practice by deploying a simple website
- use Docker images within GitLab
This course will NOT make you a GitLab CI / DevOps expert
A lot of courses promise you will become an expert. Becoming an expert in any tool comes with time and hard work. It simply does not make sense to promise something like that; it would not be honest.
What I will do is explain the basics and offer you enough practice opportunities so that you can easily apply what you learn in your own projects as well. I will show you how to build pipelines with GitLab CI.
- Software developers learning to build pipelines in order to test & deploy code
- IT Professionals: Developers, Software Engineers, Application Architects, Infrastructure Architects, and Operations