
Practice Exam AWS Certified Solutions Architect Professional
Description
Preparing for AWS Certified Solutions Architect Professional SAP-C02? This is THE practice exams course to give you the winning edge.
These practice exams have been co-authored by Stephane Maarek and Abhishek Singh who bring their collective experience of passing 18 AWS Certifications to the table.
The tone and tenor of the questions mimic the real exam. Along with the detailed explanations and "exam alerts" provided for each question, we have also extensively referenced AWS documentation to get you up to speed on all domain areas tested in the SAP-C02 exam.
We want you to think of this course as the final pit-stop so that you can cross the winning line with absolute confidence and get AWS Certified! Trust our process, you are in good hands.
All questions have been written from scratch! And more questions are being added over time!
=======
Quality speaks for itself...
SAMPLE QUESTION:
A Silicon Valley-based unicorn startup recently launched a video-sharing social networking service called KitKot. The startup uses AWS Cloud to manage its IT infrastructure. Users upload video files up to 1 GB in size to an application server running on a single EC2 instance, which stores them on a shared EFS file system. Another set of EC2 instances, managed via an Auto Scaling group, periodically scans the EFS share directory for new files to process and generates new videos (for thumbnails and composite visual effects) according to the video processing instructions that are uploaded alongside the raw video files. After processing, the raw video files are deleted from the EFS file system and the results are stored in an S3 bucket. Links to the processed video files are sent to the users via in-app notifications. The startup has recently found that even as more instances are added to the Auto Scaling group, many files are processed twice, so video processing speed does not improve.
As an AWS Certified Solutions Architect Professional, what would you recommend to improve the reliability of the solution as well as eliminate the redundant processing of video files?
Refactor the application to run from Amazon S3 instead of the EFS file system and upload the video files directly to an S3 bucket via an API Gateway based REST API. Configure an S3 trigger to invoke a Lambda function each time a file is uploaded and the Lambda in turn processes the video and stores the processed files in another bucket. Leverage CloudWatch Events to trigger an SNS notification to send an in-app notification to the user containing the links to the processed files
Refactor the application to run from S3 instead of EFS and upload the video files directly to an S3 bucket. Configure an S3 trigger to invoke a Lambda function on each video file upload to S3 that puts a message in an SQS queue containing the link and the video processing instructions. Change the video processing application to read from the SQS queue and the S3 bucket. Configure the queue depth metric to scale the size of the Auto Scaling group for video processing instances. Leverage CloudWatch Events to trigger an SNS notification to the user containing the links to the processed files
Refactor the application to run from S3 instead of EFS and upload the video files directly to an S3 bucket. Set up an EventBridge event to trigger a Lambda function on each file upload that puts a message in an SQS queue containing the link and the video processing instructions. Change the video processing application to read from SQS queue for new files and configure the queue depth metric to scale instances in the video processing Auto Scaling group. Leverage EventBridge events to trigger an SNS notification to the user containing the links to the processed files
Create an hourly cron job on the application server to synchronize the contents of the EFS share with S3. Trigger a Lambda function every time a file is uploaded to S3 and process the video file to store the results in another S3 bucket. Once the file is processed, leverage CloudWatch Events to trigger an SNS notification to send an in-app notification to the user containing the links to the processed files
What's your guess? Scroll below for the answer...
Correct: 2.
Refactor the application to run from S3 instead of EFS and upload the video files directly to an S3 bucket. Configure an S3 trigger to invoke a Lambda function on each video file upload to S3 that puts a message in an SQS queue containing the link and the video processing instructions. Change the video processing application to read from the SQS queue and the S3 bucket. Configure the queue depth metric to scale the size of the Auto Scaling group for video processing instances. Leverage CloudWatch Events to trigger an SNS notification to the user containing the links to the processed files
For the given use-case, the primary way to address the issues related to reliability, as well as redundant processing of video files, is by introducing SQS into the solution stack. SQS offers a secure, durable, and available hosted queue that lets you integrate and decouple distributed software systems and components. SQS locks your messages during processing, so that multiple producers can send and multiple consumers can receive messages at the same time. Using the right combination of delay queues and visibility timeout, you can optimize the solution to address use-cases where the consumer application needs additional time to process messages such as the one in this scenario. Messages are put into the SQS queue via a Lambda function that is triggered when a new video file is uploaded to S3 for processing.
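To illustrate the producer side of this design, here is a minimal Python (boto3) sketch of an S3-triggered Lambda function that enqueues one SQS message per uploaded video. The queue URL environment variable and the instructions-file naming convention are assumptions made for the example, not details from the scenario:

```python
import json
import os
import urllib.parse

import boto3

# Assumed environment variable holding the processing queue URL (hypothetical name).
QUEUE_URL = os.environ["VIDEO_PROCESSING_QUEUE_URL"]

sqs = boto3.client("sqs")


def lambda_handler(event, context):
    """Triggered by S3 ObjectCreated events; enqueues one message per uploaded video."""
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        # The message carries the object location plus a pointer to the processing
        # instructions uploaded alongside the raw video (naming convention assumed).
        message = {
            "bucket": bucket,
            "video_key": key,
            "instructions_key": key + ".instructions.json",
        }
        sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps(message))

    return {"enqueued": len(records)}
```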
To ensure that the consumer applications running on the video processing instances can scale via an Auto Scaling group, you could use the SQS queue depth (surfaced as the CloudWatch Amazon SQS metric ApproximateNumberOfMessagesVisible) as the underlying metric. However, the issue with using a CloudWatch Amazon SQS metric such as ApproximateNumberOfMessagesVisible for target tracking is that the number of messages in the queue might not change proportionally to the size of the Auto Scaling group that processes messages from the queue. An optimized solution is to use a backlog per instance metric, with the target value being the acceptable backlog per instance to maintain.
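The backlog per instance value is not published by SQS directly, so you would compute it yourself and push it as a custom CloudWatch metric for the target tracking policy to act on. A minimal sketch, assuming hypothetical queue and Auto Scaling group names:

```python
import boto3

# Hypothetical resource names used purely for illustration.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/video-processing-queue"
ASG_NAME = "video-processing-asg"

sqs = boto3.client("sqs")
autoscaling = boto3.client("autoscaling")
cloudwatch = boto3.client("cloudwatch")


def publish_backlog_per_instance():
    """Publish backlog-per-instance as a custom metric for target tracking scaling."""
    # Queue depth: the ApproximateNumberOfMessages queue attribute corresponds to the
    # ApproximateNumberOfMessagesVisible CloudWatch metric.
    attrs = sqs.get_queue_attributes(
        QueueUrl=QUEUE_URL, AttributeNames=["ApproximateNumberOfMessages"]
    )
    backlog = int(attrs["Attributes"]["ApproximateNumberOfMessages"])

    # Count in-service instances in the video processing Auto Scaling group.
    groups = autoscaling.describe_auto_scaling_groups(AutoScalingGroupNames=[ASG_NAME])
    instances = groups["AutoScalingGroups"][0]["Instances"]
    in_service = sum(1 for i in instances if i["LifecycleState"] == "InService") or 1

    cloudwatch.put_metric_data(
        Namespace="VideoProcessing",
        MetricData=[{
            "MetricName": "BacklogPerInstance",
            "Value": backlog / in_service,
            "Unit": "None",
        }],
    )
```

A scheduled job (for example, an EventBridge rule invoking this function every minute) keeps the metric fresh, and the target tracking policy then scales the group toward your acceptable backlog per instance.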
Incorrect options:
Refactor the application to run from Amazon S3 instead of the EFS file system and upload the video files directly to an S3 bucket via an API Gateway based REST API. Configure an S3 trigger to invoke a Lambda function each time a file is uploaded and the Lambda, in turn, processes the video and stores the processed files in another bucket. Leverage CloudWatch Events to trigger an SNS notification to send an in-app notification to the user containing the links to the processed files - API Gateway supports a maximum payload size of 10 MB, so this option is incorrect for the given use case since you need to support file sizes of up to 1 GB for video processing.
Refactor the application to run from S3 instead of EFS and upload the video files directly to an S3 bucket. Set up an EventBridge event to trigger a Lambda function on each file upload that puts a message in an SQS queue containing the link and the video processing instructions. Change the video processing application to read from the SQS queue for new files and configure the queue depth metric to scale instances in the video processing Auto Scaling group. Leverage EventBridge events to trigger an SNS notification to the user containing the links to the processed files - You can certainly configure an EventBridge rule to handle a new object upload on S3, which in turn triggers a Lambda function. However, this is a roundabout way of propagating the object upload event to the Lambda function, so this is not the best-fit option.
Create an hourly cron job on the application server to synchronize the contents of the EFS share with S3. Trigger a Lambda function every time a file is uploaded to S3 and process the video file to store the results in another S3 bucket. Once the file is processed, leverage CloudWatch Events to trigger an SNS notification to send an in-app notification to the user containing the links to the processed files - The issue with this option is the lack of reliability. If the Lambda function that is triggered when a video file is uploaded to S3 fails to process a given video file, the source video file remains unprocessed forever, because there is no queue-based mechanism to retry failed events. So this option is incorrect.
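For contrast with the failure mode described in this option, here is a minimal sketch of the consumer side of the recommended queue-based design: a message is deleted only after the video has been processed successfully, so a failed attempt simply becomes visible again after the visibility timeout and another instance can retry it. The queue URL, timeout values, and processing helper below are illustrative placeholders:

```python
import json

import boto3

# Hypothetical queue URL; in the recommended design this is the video processing queue.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/video-processing-queue"

sqs = boto3.client("sqs")
s3 = boto3.client("s3")


def poll_and_process():
    """Long-poll the queue and delete a message only after successful processing."""
    response = sqs.receive_message(
        QueueUrl=QUEUE_URL,
        MaxNumberOfMessages=1,
        WaitTimeSeconds=20,        # long polling reduces empty receives
        VisibilityTimeout=900,     # should exceed the worst-case processing time
    )
    for msg in response.get("Messages", []):
        body = json.loads(msg["Body"])
        try:
            process_video(body["bucket"], body["video_key"], body["instructions_key"])
        except Exception:
            # Do NOT delete the message: it becomes visible again after the
            # visibility timeout, so another instance can retry the work.
            continue
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])


def process_video(bucket, video_key, instructions_key):
    # Placeholder for the thumbnail / composite visual effects pipeline.
    s3.download_file(bucket, video_key, "/tmp/raw-video")
```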
< with reference links>
=======
Instructor
My name is Stéphane Maarek, I am passionate about Cloud Computing, and I will be your instructor in this course. I teach about AWS certifications, focusing on helping my students improve their professional proficiencies in AWS.
I have already taught 1,500,000+ students and received 500,000+ reviews throughout my career designing and delivering AWS certification courses!
I'm delighted to welcome Abhishek Singh as my co-instructor for these practice exams!
=======
Welcome to the best practice exams to help you prepare for your AWS Certified Solutions Architect Professional exam.
- You can retake the exams as many times as you want
- This is a huge original question bank
- You get support from instructors if you have questions
- Each question has a detailed explanation
- Mobile-compatible with the Udemy app
- 30-day money-back guarantee if you're not satisfied
We hope that by now you're convinced!... And there are a lot more questions inside the course.
Happy learning and best of luck for your AWS Certified Solutions Architect Professional SAP-C02 exam!
Who this course is for:
- Anyone preparing for the AWS Certified Solutions Architect Professional SAP-C02 exam
Instructors
Stephane is a solutions architect, consultant and software developer who has a particular interest in all things related to Big Data, Cloud & APIs. He's also a many-time best-selling instructor on Udemy for his courses on AWS and Apache Kafka.
[See FAQ below to see in which order you can take my courses]
Stéphane is recognized as an AWS Hero and is an AWS Certified Solutions Architect Professional & AWS Certified DevOps Professional. He loves to teach people how to use AWS properly, to get them ready for their AWS certifications, and most importantly for the real world.
He also loves Apache Kafka. He sits on the 2019 Program Committee organizing the Kafka Summit in New York, London and San Francisco. He is also an active member of the Apache Kafka community, authoring blog posts on Medium and guest posts for Confluent.
During his spare time he enjoys cooking, practicing yoga, surfing, watching TV shows, and traveling to awesome destinations!
FAQ: In which order should you learn?...
AWS Cloud: Start with AWS Certified Solutions Architect Associate, then move on to AWS Certified Developer Associate and then AWS Certified SysOps Administrator. Afterwards you can either do AWS Certified Solutions Architect Professional or AWS Certified DevOps Professional, or a specialty certification of your choosing.
Apache Kafka: Start with Apache Kafka for Beginners, then you can learn Connect, Streams and Schema Registry if you're a developer, and Setup and Monitoring courses if you're an admin. Both tracks are needed to pass the Confluent Kafka certification.
Abhishek is an AWS veteran and has built successful SaaS and consumer solutions using AWS services since 2012. Over the course of his professional career, Abhishek has interviewed and mentored hundreds of candidates for entry-level and lateral positions developing Cloud-based IT solutions. Abhishek is passionate about sharing his knowledge of AWS Cloud, Machine Learning and Big Data. He wants to help his fellow IT professionals level up their skills, ace the AWS certifications and, above all, get ready for the real-world AWS ecosystem.
He is an AWS Certified Solutions Architect Professional and AWS Certified DevOps Engineer Professional, and holds the AWS Certified Machine Learning, Big Data and Database Specialty certifications.
Overall, Abhishek has over 14 years of experience working on a diverse range of enterprise technologies based on ML, Big Data and Analytics. He runs a successful ML and Big Data consultancy advocating solutions on AWS Cloud and has advised multiple clients in the US on architecting and implementing their ML and Big Data solutions using the AWS suite of services.