Serverless Microservice with AWS - A Complete Guide: 3-in-1
- 11.5 hours on-demand video
- 1 downloadable resource
- Full lifetime access
- Access on mobile and TV
- Certificate of Completion
- Improve the reusability, composability, and maintainability of code.
- Create a highly available serverless microservice data API.
- Build, deploy and run your serverless configuration and code.
- Speed up delivery, flexibility, and time to market using serverless microservices.
- Add your microservices to a continuous integration & continuous delivery pipeline.
- Estimate, and reduce maintenance and running costs.
- Implement over 15 microservices architecture patterns without needing containers or EC2 instances.
- Scale up without significant changes to tooling, architecture, or development practices.
- Reduce the risk and cost of operating a cloud platform.
Many architectures existed before microservices. In this video, we will see what they are, what their benefits and drawbacks are, and how they compare to microservices.
See the overview of monolithic multi-tier architecture and Service-Oriented Architecture (SOA)
See the overview of microservices architecture
See the benefits and drawbacks of monolithic and microservices architectures
In this video, we will show where serverless computing fits in among other public cloud offerings.
See the overview of virtual machines benefits and drawbacks
See the overview of container benefits and drawbacks
See the overview of serverless computing benefits and drawbacks
In this video, we will set up a new user. We will also see how to ensure that user access is secure and how to set up a local serverless environment.
Use the AWS Management Console to set up a new user
Set up multi-factor authentication for a user
Set up your local environment credentials, AWS Python SDK and CLI
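As a sketch of what the local-credentials step produces: running `aws configure` writes an INI-format file to `~/.aws/credentials`, which both the CLI and the Python SDK (boto3) read. The profile name and key values below are placeholders, not real credentials; the default region normally lives in a similar `~/.aws/config` file.

```python
import configparser

# "aws configure" writes credentials in INI format; this parses an example.
# The profile name and key values are placeholders, not real credentials.
sample = """
[serverless-user]
aws_access_key_id = AKIAEXAMPLE
aws_secret_access_key = wJalrEXAMPLEKEY
"""

config = configparser.ConfigParser()
config.read_string(sample)

profile = config["serverless-user"]
print(profile["aws_access_key_id"])  # AKIAEXAMPLE
```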
In this video, we will see that, with data breaches and compliance requirements, security is a key consideration.
See security examples and their impact on an organization
Explain security at rest, security in transit, authentication, and authorization
Learn that AWS has a shared security responsibility model that should be followed
In this video, we will see how we can configure AWS security to help secure user authorization and access to resources.
Explain AWS Identity and Access Management (IAM)
Use IAM policies, users, and groups to help secure your users
Explain IAM roles to help secure your trusted entity
In this video, we will see how to query DynamoDB from within a Lambda function.
Add the previous Lambda code that parses the given URL parameters
Add the previous code that queries DynamoDB by partition and sort keys
Test the Lambda function with sample request data
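A minimal sketch of such a handler, assuming an API Gateway Lambda proxy event. The table name `EventTable`, the key names `EventId`/`EventDay`, and the URL parameter names are illustrative, not the course's actual schema, and the real boto3 call is left as a comment.

```python
import json

def lambda_handler(event, context=None):
    """Parse path parameters from an API Gateway Lambda proxy event and build
    DynamoDB query parameters. Table and key names are illustrative."""
    params = event.get("pathParameters") or {}
    event_id = params.get("resource_id", "0")
    start_day = params.get("start_date", "0")

    query = {
        "TableName": "EventTable",
        # Query by partition key (EventId) plus a condition on the sort key (EventDay)
        "KeyConditionExpression": "EventId = :id AND EventDay > :day",
        "ExpressionAttributeValues": {
            ":id": {"S": event_id},
            ":day": {"N": str(int(start_day))},
        },
    }
    # In the real Lambda: result = boto3.client("dynamodb").query(**query)
    return {"statusCode": 200, "body": json.dumps(query)}

# Test the handler with sample request data, as the lecture does
sample_event = {"pathParameters": {"resource_id": "324", "start_date": "20231001"}}
response = lambda_handler(sample_event)
```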
In this video, we will see how to make sure that the code still runs correctly, and how to be more productive when writing Lambda code.
See that testing allows better collaboration, improves product quality, and leads to shorter release cycles
Explain how unit tests help to verify that our functions work as expected
Mock to replace parts of the system under test
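The mocking step can be sketched with the standard library's `unittest.mock`. The `get_items` helper is a hypothetical stand-in for the course's query code, with the DynamoDB table injected as an argument so the test can replace it with a mock.

```python
from unittest.mock import MagicMock

def get_items(table, event_id):
    """Hypothetical query helper; the table is injected so a unit test can
    pass in a mock instead of a real DynamoDB table."""
    result = table.query(KeyConditionExpression=f"EventId = {event_id}")
    return result["Items"]

# In the unit test, replace the real boto3 Table with a MagicMock
mock_table = MagicMock()
mock_table.query.return_value = {"Items": [{"EventId": "324"}]}

items = get_items(mock_table, "324")
mock_table.query.assert_called_once()   # verifies the dependency was used
print(items)  # [{'EventId': '324'}]
```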
In this video, we will see how to know whether your API is performing as expected with low latency, and how to improve that latency.
Use the Python Locust load-testing tool
Load test for concurrent users
See that the latency can be improved by changing DynamoDB and Lambda settings
In this video, we will see that provisioning infrastructure manually brings many challenges, including cost, effort, a lack of repeatable processes, and limited scalability.
Use Infrastructure as Code
Learn that serverless infrastructure and resources can be deployed using the Serverless Application Model (SAM)
Learn that serverless infrastructure and resources can be deployed using the AWS CLI and other frameworks
When building serverless microservices, how can we benefit from what is available from leading companies and open-source communities?
Make use of design patterns
Use software design patterns and principles
Use and understand serverless microservice pattern categories
How do we ensure efficient communication between microservices? How are microservices created and designed?
Understand synchronous versus asynchronous communication
Understand one-to-one and many-to-many communication microservice patterns
Understand decomposition pattern by business capability
What are the serverless microservice patterns that we can leverage in AWS to reduce development, maintenance, and running costs?
Understand Lambda event source types
Understand Lambda stream event sources
Leverage AWS managed services and serverless offerings in each category and use case
Lambda functions are stateless, so best practice is to store state externally. What is important in distributed systems, and what are the best ways to store state?
Understand that the CAP theorem stands for Consistency, Availability, and Partition tolerance, and why it is important for data storage in distributed systems
Understand the shared database and database per service patterns, and how they are used in distributed systems
Understand how these patterns can be implemented with DynamoDB, RDS, Aurora, or Aurora Serverless
Why and how can a lambda access an RDS instance?
Access MySQL RDS from API gateway via a lambda function
Understand the code and configuration to set it up
Ensure the lambda role has a policy that allows it to attach and detach network interfaces within a VPC
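For reference, the network-interface permissions mentioned above correspond to the actions granted by AWS's managed `AWSLambdaVPCAccessExecutionRole` policy. This sketch builds the equivalent policy statement as a Python dict so it can be serialized for an IAM role.

```python
import json

# Mirrors the VPC permissions in AWS's managed AWSLambdaVPCAccessExecutionRole
# policy: the Lambda service must create, describe, and delete elastic
# network interfaces to attach the function to your VPC.
vpc_access_statement = {
    "Effect": "Allow",
    "Action": [
        "ec2:CreateNetworkInterface",
        "ec2:DescribeNetworkInterfaces",
        "ec2:DeleteNetworkInterface",
    ],
    "Resource": "*",
}

policy = {"Version": "2012-10-17", "Statement": [vpc_access_statement]}
print(json.dumps(policy, indent=2))
```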
What are the main differences for a lambda to access Aurora over RDS MySQL?
Use the same code to access MySQL RDS, MySQL Aurora, and Aurora Serverless
Understand that Aurora is clustered by default
Understand the additional read replica database endpoint in Aurora
What are the main steps to ensure your RDS or Aurora database communication is secure?
Understand RDS and Aurora encryption in transit code and configuration
Set up RDS or Aurora IAM database authentication
Use IAM database authentication to generate an authorization token rather than use a password
How can we overcome database persistence limitations, such as the lack of a full history of row changes, and the challenges of efficient distributed synchronisation/replication?
Use the event sourcing pattern where every action is an event appended to an event store
Use the Command Query Responsibility Segregation (CQRS) pattern to split write and read methods
Understand that in CQRS, commands are methods that mutate state, and queries are methods that return data
How can we implement the serverless event sourcing pattern?
Have API Gateway with or without a lambda proxy write to DynamoDB
Have API Gateway with or without a lambda proxy write to Kinesis Streams
Have API Gateway with or without a Lambda proxy write to SNS or SQS; a Lambda triggered on a CloudWatch schedule then processes the messages
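The append-only idea behind these write paths can be sketched in a few lines of plain Python. In the AWS designs above, the event store would be DynamoDB or a Kinesis Stream rather than an in-memory list, and the event types and bank-account example here are invented for illustration.

```python
# Event sourcing sketch: every action is recorded as an immutable event,
# and current state is derived by replaying the history.
event_store = []

def append_event(event_type, amount):
    event = {"seq": len(event_store), "type": event_type, "amount": amount}
    event_store.append(event)   # events are only ever appended, never updated

def current_balance():
    # Replaying the full history reconstructs state at any point in time
    balance = 0
    for event in event_store:
        if event["type"] == "Deposited":
            balance += event["amount"]
        elif event["type"] == "Withdrawn":
            balance -= event["amount"]
    return balance

append_event("Deposited", 100)
append_event("Withdrawn", 30)
print(current_balance())  # 70
```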
How can we implement the serverless CQRS pattern command and event processor?
Have API Gateway with or without a Lambda proxy write to DynamoDB. Then a Lambda writes to DynamoDB and Kinesis Firehose
Have API Gateway with or without a Lambda proxy write to SQS/SNS. Then a Lambda writes to DynamoDB and Kinesis Firehose
Have API Gateway with or without a Lambda proxy write to Kinesis Streams. Then a Lambda writes to DynamoDB, and Kinesis Streams writes to Kinesis Firehose
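A minimal in-memory sketch of the CQRS split, with illustrative names. On AWS, the command side would be API Gateway writing to DynamoDB, SQS/SNS, or Kinesis as listed above, and the projector that keeps the read model in sync would be a Lambda consuming the stream.

```python
# CQRS sketch: commands mutate state through an append-only write store;
# queries read from a separately maintained, denormalised read model.
write_store = []
read_model = {}

def handle_command(user_id, item):
    """Command: mutates state and returns nothing."""
    event = {"user_id": user_id, "item": item}
    write_store.append(event)
    project(event)   # on AWS, a Lambda on the stream would do this step

def project(event):
    """Projector: updates the read model from write-side events."""
    read_model.setdefault(event["user_id"], []).append(event["item"])

def query_items(user_id):
    """Query: returns data and never mutates state."""
    return read_model.get(user_id, [])

handle_command("u1", "book")
handle_command("u1", "pen")
print(query_items("u1"))  # ['book', 'pen']
```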
Why is it important to monitor your microservices and what patterns are available to do so?
Manage 1000s of microservices: governance, compliance, operational auditing, risk auditing, and so on
Use application metrics and health checks patterns
Use centralized logging, audit logging, and distributed tracing patterns
How can serverless application metrics and health check API patterns be implemented?
Use Amazon CloudWatch
Use a lambda triggered by CloudWatch to run health checks
Use a lambda to write the results of the check to CloudWatch metrics and set an Alarm that sends an email notification upon failure
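A sketch of that health-check flow: probe an endpoint, then build the payload the Lambda would pass to CloudWatch `put_metric_data`, on which an Alarm can trigger the email notification. The namespace and metric name are invented, and the boto3 call is shown as a comment.

```python
import urllib.request

def check_health(url, timeout=3):
    """Probe an HTTP health endpoint; any error counts as unhealthy."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False

def build_metric(healthy):
    """Payload for CloudWatch put_metric_data; an Alarm on this metric can
    then notify by email (via SNS) when the value drops to 0."""
    return {
        "Namespace": "Microservices/HealthChecks",  # invented namespace
        "MetricData": [
            {"MetricName": "ApiHealthy", "Value": 1.0 if healthy else 0.0}
        ],
    }

# In the Lambda, after running check_health(...):
#   boto3.client("cloudwatch").put_metric_data(**metric)
metric = build_metric(True)
```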
What are the different ways to send logs to CloudWatch logs?
Use API Gateway built-in logging or custom logging
Use the Lambda print function, native logging, or external JSON logging packages
For non-Lambda microservices (e.g. traditional EC2 or container-based) use the CloudWatch Logs agent
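One way to get JSON logging with only the standard library (no external packages): a custom `logging.Formatter` that emits each record as a JSON object. Lambda forwards stdout/stderr lines to CloudWatch Logs, where structured fields can then be queried. The logger name is illustrative.

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Render each log record as one JSON line so CloudWatch Logs can
    query individual fields."""
    def format(self, record):
        return json.dumps({
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("data-api")   # illustrative logger name
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("query complete")   # emits a {"level": "INFO", ...} JSON line
```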
How can I configure Lambda, DynamoDB and other microservices to send traces to X-Ray?
Enable built-in Lambda X-Ray integration with the correct policy
For Lambda and DynamoDB in Python use aws-xray-sdk via xray_recorder begin and end, or decorators
For non-lambda services use aws-xray-daemon
How can I architect a serverless discovery and catalogue service?
For batch discovery use a lambda function that lists all resources, and persists results to DynamoDB
For near-real-time use AWS CloudTrail that sends logs to CloudWatch logs that a lambda scans, and persists results to DynamoDB
Expose the DynamoDB catalogue using API gateway via a Lambda proxy
What are the focus and differences of Continuous Integration, Continuous Delivery, and Continuous Deployment?
Use Continuous Integration to focus on automating builds and testing to detect issues early
Use Continuous Delivery to focus on an automated release process with a human approval step for the production release
Use Continuous Deployment to focus on a fully automated release process without a human approval step for the production release
What AWS managed services are used in a serverless CI/CD pipeline and what are they for?
Use IAM roles and policies for security and access
Use AWS CodeCommit for code version control. Use AWS CodeBuild to manage build service
Use AWS CodePipeline for continuous delivery, and as a service that helps you build a CI/CD pipeline
When is it best to use serverless computing versus containers?
Look at your non-functional requirements
Use containers for very low latency (< 50 ms), large numbers of concurrent requests (> 3,000), large deployment packages, and fixed costs
Use serverless when you can tolerate latency > 50 ms, want to focus on business logic code, prefer PAYG costs, and are already on AWS
How do you scale database and event streaming resources?
Scale DynamoDB using global secondary indexes (GSIs), creating a batch API, and using DAX
Scale Aurora using read replicas and DB optimization; run load testing with production loads and monitor performance
To scale Kinesis Streams, look at the number of consumers required, and load test by replaying full production volumes
- Prior experience with traditional application development is assumed.
- Basic understanding of microservices and serverless architecture will be useful.
Microservices are a popular new approach to building maintainable, scalable, cloud-based applications, and AWS is the perfect platform for hosting Microservices. Recently, there has been growing interest in Serverless computing due to increased developer productivity, built-in auto-scaling abilities, and reduced operational costs.
Building a microservices platform using virtual machines or containers involves a lot of initial and ongoing effort. There is a cost associated with having idle services running, maintenance of the boxes, and configuration complexity involved in scaling up and down.
By combining microservices and serverless computing, organizations benefit from having the servers and capacity planning managed by the cloud provider, making services much easier to deploy and run at scale.
This comprehensive 3-in-1 course is a step-by-step tutorial and the perfect way to learn to implement Microservices using Serverless computing on AWS. Build highly available Microservices to power applications of any size and scale. Get to grips with Microservices and overcome the limitations and challenges experienced in traditional monolithic deployments. Design a highly available and cost-efficient Microservices application using AWS. Create a system where the infrastructure, scalability, and security are managed by AWS. Finally, reduce your support, maintenance, and infrastructure costs.
Contents and Overview
This training program includes 3 complete courses, carefully chosen to give you the most comprehensive training possible.
The first course, Building Microservices on AWS, covers building highly available Microservices to power applications of any size and scale. This course shows you how to build Microservices-based applications on AWS. Overcome the limitations and challenges you experience in traditional monolith deployments. By the end of the course, you'll have learned to apply AWS tools to create and deploy Microservices-based applications. You'll be able to make your applications cost-effective, easier to scale, and faster to develop.
The second course, Building a Scalable Serverless Microservice REST Data API, covers practical solutions to building Serverless applications. In this course, we show you how to build an end-to-end serverless application for your organization. We have selected a data API use case that could reduce costs and give you more flexibility in how you and your clients consume or present your application, metrics, and insight data. We make use of the latest serverless deployment and build framework, share our experience of testing, and provide best practices for running a serverless stack in a production environment.
The third course, Implementing Serverless Microservices Architecture Patterns, covers implementing Microservices using Serverless Computing on AWS. In this course, we will show you how Serverless computing can be used to implement the majority of the Microservice architecture patterns, and how, when put into a continuous integration & continuous delivery pipeline, it can dramatically increase the delivery speed, productivity, and flexibility of the development team in your organization, while reducing the overall running, operational, and maintenance costs. By the end of the course, you'll be able to build, test, deploy, scale, and monitor your microservices with ease using Serverless computing in a continuous delivery pipeline.
By the end of the course, you’ll create a secure, scalable, and Serverless data API to build highly available Microservices to power applications of any size and scale.
About the Authors
● Alan Rodrigues has been working with software components such as Docker containers and Kubernetes for the last 2 years. He has extensive experience working on the AWS platform, and is currently certified as an AWS Solutions Architect Associate, a SysOps Administrator, and a Developer Associate. He has seen that organizations are moving towards using containers as part of their Microservices architecture, and that there is a strong need to have a container orchestration tool in place. Kubernetes is by far the most popular container orchestration tool on the market.
● Richard T. Freeman, PhD currently works for JustGiving, a tech-for-good social platform for online giving that’s helped 25 million users in 164 countries raise $5 billion for good causes. He is also offering independent and short-term freelance cloud architecture & machine learning consultancy services. Richard is a hands-on certified AWS Solutions Architect, Data & Machine Learning Engineer with proven success in delivering cloud-based big data analytics, data science, high-volume, and scalable solutions. At Capgemini, he worked on large and complex projects for Fortune Global 500 companies and has experience in extremely diverse, challenging and multi-cultural business environments. Richard has a solid background in computer science and holds a Master of Engineering (MEng) in computer systems engineering and a Doctorate (Ph.D.) in machine learning, artificial intelligence and natural language processing. See his website for his latest blog posts and speaking engagements. He has worked in nonprofit, insurance, retail banking, recruitment, financial services, financial regulators, central government and e-commerce sectors, where he:
- Provided the delivery, architecture and technical consulting on client site for complex event processing, business intelligence, enterprise content management, and business process management solutions.
- Delivered in-house production cloud-based big data solutions for large-scale graph, machine learning, natural language processing, serverless, cloud data warehousing, ETL data pipeline, recommendation engines, and real-time streaming analytics systems.
- Worked closely with IBM and AWS and presented at industry events and summits, published research articles in numerous journals, presented at conferences and acted as a peer-reviewer.
- Has over four years of production experience with Serverless computing on AWS.
- Developers, software architects, and software engineers. Developers familiar with traditional application development but interested in using Microservices in a DevOps environment will also benefit. Microservices are appropriate for large-scale enterprise environments, so this course should appeal to people interested in developing for those environments.
- Developers who need practical solutions to common problems while building their serverless application. Programming knowledge is assumed.
- Developers, architects, DevOps, administrators and operations who would like to deploy Serverless computing and microservices in their organization.