SAA-C03 Online Practice Questions


Latest SAA-C03 Exam Practice Questions

The practice questions for the SAA-C03 exam were last updated on 2025-11-01.

Viewing page 1 out of 37 pages.

Viewing questions 1-5 out of 185 questions.

Question#1

A company has a serverless web application that is composed of AWS Lambda functions. The application experiences spikes in traffic that cause increased latency because of cold starts. The company wants to improve the application's ability to handle traffic spikes and to minimize latency.
The solution must optimize costs during periods when traffic is low.
Which solution will meet these requirements?

A. Configure provisioned concurrency for the Lambda functions. Use AWS Application Auto Scaling to adjust the provisioned concurrency.
B. Launch Amazon EC2 instances in an Auto Scaling group. Add a scheduled scaling policy to launch additional EC2 instances during peak traffic periods.
C. Configure provisioned concurrency for the Lambda functions. Set a fixed concurrency level to handle the maximum expected traffic.
D. Create a recurring schedule in Amazon EventBridge Scheduler. Use the schedule to invoke the Lambda functions periodically to warm the functions.

Explanation:
Provisioned Concurrency:
AWS Lambda’s provisioned concurrency ensures that a predefined number of execution environments are pre-warmed and ready to handle requests, reducing latency during traffic spikes.
This solution optimizes costs during low-traffic periods when combined with AWS Application Auto Scaling to dynamically adjust the provisioned concurrency based on demand.
Incorrect Options Analysis:
Option B: Switching to EC2 would increase complexity and cost for a serverless application.
Option C: A fixed concurrency level may result in over-provisioning during low-traffic periods, leading to higher costs.
Option D: Periodically warming functions does not effectively handle sudden spikes in traffic.
Reference: AWS Lambda Provisioned Concurrency
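The combination in option A can be sketched as two Application Auto Scaling API calls. The dicts below show the request parameters as they would be passed to boto3's `register_scalable_target` and `put_scaling_policy`; the function name and alias are hypothetical placeholders.

```python
# Sketch: parameters for scaling Lambda provisioned concurrency with
# Application Auto Scaling. With boto3 these would be passed as:
#   boto3.client("application-autoscaling").register_scalable_target(**target)
#   boto3.client("application-autoscaling").put_scaling_policy(**policy)

FUNCTION = "checkout-handler"   # hypothetical function name
ALIAS = "prod"                  # provisioned concurrency applies to an alias or version

target = {
    "ServiceNamespace": "lambda",
    "ResourceId": f"function:{FUNCTION}:{ALIAS}",
    "ScalableDimension": "lambda:function:ProvisionedConcurrency",
    "MinCapacity": 1,    # low floor keeps cost down during quiet periods
    "MaxCapacity": 100,  # ceiling for traffic spikes
}

policy = {
    "PolicyName": f"{FUNCTION}-pc-tracking",
    "ServiceNamespace": "lambda",
    "ResourceId": target["ResourceId"],
    "ScalableDimension": target["ScalableDimension"],
    "PolicyType": "TargetTrackingScaling",
    "TargetTrackingScalingPolicyConfiguration": {
        # Scale so that roughly 70% of provisioned environments are in use.
        "TargetValue": 0.7,
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "LambdaProvisionedConcurrencyUtilization"
        },
    },
}

print(policy["ResourceId"])  # function:checkout-handler:prod
```

The target-tracking policy is what satisfies the cost requirement: provisioned concurrency is scaled down toward `MinCapacity` when utilization is low.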

Question#2

A company is planning to deploy a managed MySQL database solution for its non-production applications. The company plans to run the system for several years on AWS.
Which solution will meet these requirements MOST cost-effectively?

A. Create an Amazon RDS for MySQL instance. Purchase a Reserved Instance.
B. Create an Amazon RDS for MySQL instance. Use the instance on an on-demand basis.
C. Create an Amazon Aurora MySQL cluster with writer and reader nodes. Use the cluster on an on-demand basis.
D. Create an Amazon EC2 instance. Manually install and configure MySQL Server on the instance.

Explanation:
Amazon RDS for MySQL Reserved Instances provide significant savings over on-demand pricing when you plan to run the database for long periods. This is the most cost-effective option for non-production, long-running managed MySQL workloads.
Reference Extract:
"Reserved Instances provide a significant discount compared to On-Demand pricing and are
recommended for steady-state workloads that run for an extended period."
Source: AWS Certified Solutions Architect - Official Study Guide, RDS Cost Optimization section.
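The savings argument is simple arithmetic. The sketch below uses hypothetical hourly rates (not actual AWS list prices) to show why a Reserved Instance wins for a workload that runs continuously for years.

```python
# Illustrative cost comparison for a small RDS for MySQL instance running
# 24/7 for 3 years. Both rates are hypothetical, not AWS list prices.
on_demand_hourly = 0.068           # hypothetical On-Demand rate, USD/hour
reserved_effective_hourly = 0.042  # hypothetical 3-year Reserved rate, USD/hour

hours = 3 * 365 * 24
on_demand_total = on_demand_hourly * hours
reserved_total = reserved_effective_hourly * hours

savings_pct = 100 * (1 - reserved_total / on_demand_total)
print(f"Reserved saves about {savings_pct:.0f}% over 3 years")  # about 38%
```

The break-even logic is the key exam point: Reserved pricing only pays off for steady, long-running workloads, which is exactly what the question describes.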

Question#3

A company stores sensitive customer data in an Amazon DynamoDB table. The company frequently updates the data. The company wants to use the data to personalize offers for customers.
The company's analytics team has its own AWS account. The analytics team runs an application on Amazon EC2 instances that needs to process data from the DynamoDB tables. The company needs to follow security best practices to create a process to regularly share data from DynamoDB to the analytics team.
Which solution will meet these requirements?

A. Export the required data from the DynamoDB table to an Amazon S3 bucket as multiple JSON files. Provide the analytics team with the necessary IAM permissions to access the S3 bucket.
B. Allow public access to the DynamoDB table. Create an IAM user that has permission to access DynamoDB. Share the IAM user with the analytics team.
C. Allow public access to the DynamoDB table. Create an IAM user that has read-only permission for DynamoDB. Share the IAM user with the analytics team.
D. Create a cross-account IAM role. Create an IAM policy that allows the AWS account ID of the analytics team to access the DynamoDB table. Attach the IAM policy to the IAM role. Establish a trust relationship between accounts.

Explanation:
Using cross-account IAM roles is the most secure and scalable way to share data between AWS accounts.
A trust relationship allows the analytics team's account to assume the role in the main account and access the DynamoDB table directly.
A is feasible but involves data duplication and additional costs for storing the JSON files in S3.
B and C violate security best practices by allowing public access to sensitive data and sharing credentials, which is highly discouraged.
AWS Documentation
Reference: Cross-Account Access with Roles
Best Practices for Amazon DynamoDB Security
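The cross-account role in option D consists of two policy documents. The sketch below shows their shape; the account IDs and table name are hypothetical placeholders.

```python
import json

# Sketch of the two policy documents for a cross-account DynamoDB role.
# Account IDs and the table ARN are hypothetical placeholders.
ANALYTICS_ACCOUNT = "222222222222"
TABLE_ARN = "arn:aws:dynamodb:us-east-1:111111111111:table/CustomerData"

# Trust policy on the role in the data-owning account: lets principals in
# the analytics account call sts:AssumeRole on this role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": f"arn:aws:iam::{ANALYTICS_ACCOUNT}:root"},
        "Action": "sts:AssumeRole",
    }],
}

# Permissions policy attached to the same role: read-only table access,
# which is all the analytics application needs.
permissions_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["dynamodb:GetItem", "dynamodb:Query", "dynamodb:Scan"],
        "Resource": TABLE_ARN,
    }],
}

print(json.dumps(trust_policy, indent=2))
```

The EC2 instances in the analytics account would then assume this role (for example via an instance profile whose policy allows `sts:AssumeRole` on it) and receive temporary credentials, so no long-lived keys are ever shared.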

Question#4

A company wants to publish a private website for its on-premises employees. The website consists of several HTML pages and image files. The website must be available only through HTTPS and must be available only to on-premises employees. A solutions architect plans to store the website files in an Amazon S3 bucket.
Which solution will meet these requirements?

A. Create an S3 bucket policy to deny access when the source IP address is not the public IP address of the on-premises environment. Set up an Amazon Route 53 alias record to point to the S3 bucket. Provide the alias record to the on-premises employees to grant the employees access to the website.
B. Create an S3 access point to provide website access. Attach an access point policy to deny access when the source IP address is not the public IP address of the on-premises environment. Provide the S3 access point alias to the on-premises employees to grant the employees access to the website.
C. Create an Amazon CloudFront distribution that includes an origin access control (OAC) that is configured for the S3 bucket. Use AWS Certificate Manager for SSL. Use AWS WAF with an IP set rule that allows access for the on-premises IP address. Set up an Amazon Route 53 alias record to point to the CloudFront distribution.
D. Create an Amazon CloudFront distribution that includes an origin access control (OAC) that is configured for the S3 bucket. Create a CloudFront signed URL for the objects in the bucket. Set up an Amazon Route 53 alias record to point to the CloudFront distribution. Provide the signed URL to the on-premises employees to grant the employees access to the website.

Explanation:
This solution uses CloudFront to serve the website securely over HTTPS, with AWS Certificate Manager (ACM) providing the SSL/TLS certificate. Origin access control (OAC) ensures that only CloudFront can access the S3 bucket directly. AWS WAF with an IP set rule restricts access to the website, allowing only the on-premises IP address. Route 53 is used to create an alias record that points to the CloudFront distribution. This setup ensures secure, private access to the website with low administrative overhead.
Options A and B: S3 bucket policies and access points do not provide HTTPS support for static website hosting, nor do they offer the same level of security as CloudFront with WAF.
Option D: Signed URLs are more suitable for temporary, expiring access rather than a permanent solution for on-premises employees.
AWS
Reference: Amazon CloudFront with Origin Access Control
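The OAC piece of option C typically comes with an S3 bucket policy like the sketch below: only requests signed by CloudFront on behalf of one specific distribution may read objects. Bucket name, account ID, and distribution ID are hypothetical placeholders.

```python
# Sketch of the S3 bucket policy commonly paired with CloudFront OAC:
# the CloudFront service principal may read objects, but only when the
# request originates from this one distribution. All identifiers are
# hypothetical placeholders.
BUCKET = "internal-site-assets"
DISTRIBUTION_ARN = "arn:aws:cloudfront::111111111111:distribution/EDFDVBD6EXAMPLE"

bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "AllowCloudFrontServicePrincipal",
        "Effect": "Allow",
        "Principal": {"Service": "cloudfront.amazonaws.com"},
        "Action": "s3:GetObject",
        "Resource": f"arn:aws:s3:::{BUCKET}/*",
        # Scopes access to requests made on behalf of this distribution.
        "Condition": {"StringEquals": {"AWS:SourceArn": DISTRIBUTION_ARN}},
    }],
}

print(bucket_policy["Statement"][0]["Sid"])  # AllowCloudFrontServicePrincipal
```

Because the bucket accepts no other principals, employees cannot bypass the WAF IP restriction by hitting the S3 endpoint directly.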

Question#5

A company runs an application that stores and shares photos. Users upload the photos to an Amazon S3 bucket. Every day, users upload approximately 150 photos. The company wants to design a solution that creates a thumbnail of each new photo and stores the thumbnail in a second S3 bucket.
Which solution will meet these requirements MOST cost-effectively?

A. Configure an Amazon EventBridge scheduled rule to invoke a script every minute on a long-running Amazon EMR cluster. Configure the script to generate thumbnails for the photos that do not have thumbnails. Configure the script to upload the thumbnails to the second S3 bucket.
B. Configure an Amazon EventBridge scheduled rule to invoke a script every minute on a memory-optimized Amazon EC2 instance that is always on. Configure the script to generate thumbnails for the photos that do not have thumbnails. Configure the script to upload the thumbnails to the second S3 bucket.
C. Configure an S3 event notification to invoke an AWS Lambda function each time a user uploads a new photo to the application. Configure the Lambda function to generate a thumbnail and to upload the thumbnail to the second S3 bucket.
D. Configure S3 Storage Lens to invoke an AWS Lambda function each time a user uploads a new photo to the application. Configure the Lambda function to generate a thumbnail and to upload the thumbnail to a second S3 bucket.

Explanation:
The most cost-effective and scalable solution for generating thumbnails when photos are uploaded to an S3 bucket is to use S3 event notifications to trigger an AWS Lambda function. This approach avoids the need for a long-running EC2 instance or EMR cluster, making it highly cost-effective because Lambda only charges for the time it takes to process each event.
S3 Event Notifications: Automatically trigger the Lambda function when a new photo is uploaded to the S3 bucket.
AWS Lambda: A serverless compute service that scales automatically and only charges for execution time, which makes it the most economical choice when dealing with periodic events like photo uploads.
The Lambda function can generate the thumbnail and upload it to a second S3 bucket, fulfilling the requirement efficiently.
Option A and Option B (EMR or EC2 with scheduled scripts): These are less cost-effective as they involve continuously running infrastructure, which incurs unnecessary costs.
Option D (S3 Storage Lens): S3 Storage Lens is a tool for storage analytics and is not designed for event-based photo processing.
AWS
Reference: Amazon S3 Event Notifications
AWS Lambda Pricing
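The event-handling part of the Lambda function in option C can be sketched with the standard library alone. The resizing step (for example with Pillow) and the boto3 `get_object`/`put_object` calls are elided as comments; the bucket names are hypothetical.

```python
# Sketch of the Lambda handler for the S3-triggered thumbnail flow.
# Parses the S3 event notification and derives a destination key; the
# actual image resizing (e.g. Pillow) and the boto3 download/upload
# calls are left as comments. Bucket names are hypothetical.
from urllib.parse import unquote_plus

THUMBNAIL_BUCKET = "photo-thumbnails"  # hypothetical second bucket

def thumbnail_key(source_key: str) -> str:
    """Map photos/cat.jpg -> thumbnails/cat.jpg."""
    filename = source_key.rsplit("/", 1)[-1]
    return f"thumbnails/{filename}"

def handler(event, context=None):
    results = []
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        # Object keys arrive URL-encoded in S3 event notifications.
        key = unquote_plus(record["s3"]["object"]["key"])
        dest = thumbnail_key(key)
        # With boto3 + Pillow one would now download (bucket, key),
        # resize the image, and upload it to (THUMBNAIL_BUCKET, dest).
        results.append((bucket, key, dest))
    return results

# Minimal fake S3 event for a local dry run:
event = {"Records": [{"s3": {
    "bucket": {"name": "photo-uploads"},
    "object": {"key": "photos/my+cat.jpg"},
}}]}
print(handler(event))  # [('photo-uploads', 'photos/my cat.jpg', 'thumbnails/my cat.jpg')]
```

At roughly 150 invocations per day, a function like this falls comfortably inside typical Lambda free-tier limits, which is the cost argument behind option C.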

Exam Code: SAA-C03 | Q&As: 527 | Updated: 2025-11-01
