SAA-C03 Dumps

Customer Rating & Feedback: 5 stars
98% of feedback reports that exam questions came exactly from these dumps.
Amazon SAA-C03 Question Answers

AWS Certified Solutions Architect - Associate (SAA-C03) Dumps April 2024

Are you tired of looking for a source that keeps you updated on the AWS Certified Solutions Architect - Associate (SAA-C03) Exam and offers a collection of affordable, high-quality, and easy-to-use Amazon SAA-C03 Practice Questions? Then you are in luck, because Salesforcexamdumps.com just updated them! Get ready to become an AWS Certified Solutions Architect - Associate.

PDF: $40 (regular price $100)
Test Engine: $56 (regular price $140)
PDF + Test Engine: $72 (regular price $180)

Amazon SAA-C03 PDF features:

- 683 questions with answers
- Last updated: 15 Apr, 2024
- 1 day of study required to pass the exam
- 100% passing assurance
- 100% money-back guarantee
- Free updates for 3 months
Last 24 Hours Results:

- Students Passed: 91
- Average Marks: 91%
- Questions From Dumps: 95%
- Total Happy Clients: 4166

What is Amazon SAA-C03?

Amazon SAA-C03 is the exam you must pass to earn the AWS Certified Solutions Architect - Associate certification. The certification validates a candidate's expertise in designing solutions on AWS. In this fast-paced world, a certification is the quickest way to gain your employer's approval. Take the AWS Certified Solutions Architect - Associate (SAA-C03) Exam and become a certified professional today. Salesforcexamdumps.com is always eager to extend a helping hand by providing approved and accepted Amazon SAA-C03 Practice Questions. Passing the AWS Certified Solutions Architect - Associate (SAA-C03) exam will be your ticket to a better future!

Pass with Amazon SAA-C03 Braindumps!

Contrary to the belief that certification exams are generally hard to get through, passing the AWS Certified Solutions Architect - Associate (SAA-C03) exam is incredibly easy, provided you have access to a reliable resource such as the Salesforcexamdumps.com Amazon SAA-C03 PDF. We have been in this business long enough to understand where most resources go wrong. Passing the Amazon AWS Certified Associate certification is all about having the right information. Hence, we filled our Amazon SAA-C03 Dumps with all the data you need to pass. These carefully curated sets of AWS Certified Solutions Architect - Associate (SAA-C03) Practice Questions target the most frequently repeated exam questions, so you know they are essential and can ensure passing results. Stop waiting around and order your set of Amazon SAA-C03 Braindumps now!

We aim to provide all AWS Certified Associate certification exam candidates with the best resources at minimal rates. You can check out our free demo before pressing the download button to make sure the Amazon SAA-C03 Practice Questions are what you want. And do not forget about the discount: we always give our customers a little extra.

Why Choose Amazon SAA-C03 PDF?

Unlike other websites, Salesforcexamdumps.com prioritizes the needs of AWS Certified Solutions Architect - Associate (SAA-C03) candidates. Not every Amazon exam candidate has full-time access to the internet, and it is hard to sit in front of a computer screen for too many hours. Are you one of them? We understand, which is why we offer Amazon SAA-C03 Question Answers in two formats: PDF and Online Test Engine. One is for customers who like an online platform with realistic exam simulation; the other is for those who prefer keeping their material close at hand. Moreover, you can download or print the Amazon SAA-C03 Dumps with ease.

If you still have queries, our team of experts is in service 24/7 to answer your questions. Just leave us a quick message in the chat box below or email us at [email protected].

Amazon SAA-C03 Sample Questions

Question # 1

A company is developing a mobile game that streams score updates to a backend processor and then posts results on a leaderboard. A solutions architect needs to design a solution that can handle large traffic spikes, process the mobile game updates in order of receipt, and store the processed updates in a highly available database. The company also wants to minimize the management overhead required to maintain the solution. What should the solutions architect do to meet these requirements?

A. Push score updates to Amazon Kinesis Data Streams. Process the updates in Kinesis Data Streams with AWS Lambda. Store the processed updates in Amazon DynamoDB.
B. Push score updates to Amazon Kinesis Data Streams. Process the updates with a fleet of Amazon EC2 instances set up for Auto Scaling. Store the processed updates in Amazon Redshift.
C. Push score updates to an Amazon Simple Notification Service (Amazon SNS) topic. Subscribe an AWS Lambda function to the SNS topic to process the updates. Store the processed updates in a SQL database running on Amazon EC2.
D. Push score updates to an Amazon Simple Queue Service (Amazon SQS) queue. Use a fleet of Amazon EC2 instances with Auto Scaling to process the updates in the SQS queue. Store the processed updates in an Amazon RDS Multi-AZ DB instance.
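The ordering requirement in this scenario maps to a specific Kinesis behavior: records that share a partition key are routed to the same shard and delivered in arrival order. A toy sketch of that per-key ordering (no AWS calls; `group_by_partition_key` is a hypothetical helper that only mimics shard routing):

```python
from collections import defaultdict

def group_by_partition_key(records):
    """Simulate Kinesis shard routing: records with the same partition key
    stay together and keep their arrival order, which is what lets a
    Lambda consumer process score updates in order of receipt."""
    shards = defaultdict(list)
    for partition_key, payload in records:
        shards[partition_key].append(payload)
    return dict(shards)
```

Using, say, the player ID as the partition key means each player's updates arrive at the consumer in the order they were sent, even under a large traffic spike.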


Question # 2

A company runs an SMB file server in its data center. The file server stores large files that the company frequently accesses for up to 7 days after the file creation date. After 7 days, the company needs to be able to access the files with a maximum retrieval time of 24 hours. Which solution will meet these requirements?

A. Use AWS DataSync to copy data that is older than 7 days from the SMB file server to AWS.
B. Create an Amazon S3 File Gateway to increase the company's storage space. Create an S3 Lifecycle policy to transition the data to S3 Glacier Deep Archive after 7 days.
C. Create an Amazon FSx File Gateway to increase the company's storage space. Create an Amazon S3 Lifecycle policy to transition the data after 7 days.
D. Configure access to Amazon S3 for each user. Create an S3 Lifecycle policy to transition the data to S3 Glacier Flexible Retrieval after 7 days.
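Questions like this hinge on retrieval times: S3 Glacier Deep Archive standard retrievals complete within about 12 hours, which satisfies a 24-hour window. A sketch of the 7-day lifecycle rule, shown as the dictionary that `put_bucket_lifecycle_configuration` would accept (the rule ID is a made-up name, and no API call is made):

```python
# One lifecycle rule: move every object to Glacier Deep Archive 7 days
# after creation. "archive-after-7-days" is a hypothetical rule name.
lifecycle_rule = {
    "ID": "archive-after-7-days",
    "Status": "Enabled",
    "Filter": {"Prefix": ""},  # empty prefix = apply to all objects
    "Transitions": [
        {"Days": 7, "StorageClass": "DEEP_ARCHIVE"},
    ],
}
```

The rule lives on the bucket behind the S3 File Gateway, so the SMB interface stays unchanged while older data is archived automatically.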


Question # 3

A company has an organization in AWS Organizations that has all features enabled. The company requires that all API calls and logins in any existing or new AWS account must be audited. The company needs a managed solution to prevent additional work and to minimize costs. The company also needs to know when any AWS account is not compliant with the AWS Foundational Security Best Practices (FSBP) standard. Which solution will meet these requirements with the LEAST operational overhead?

A. Deploy an AWS Control Tower environment in the Organizations management account. Enable AWS Security Hub and AWS Control Tower Account Factory in the environment.
B. Deploy an AWS Control Tower environment in a dedicated Organizations member account. Enable AWS Security Hub and AWS Control Tower Account Factory in the environment.
C. Use AWS Managed Services (AMS) Accelerate to build a multi-account landing zone (MALZ). Submit an RFC to self-service provision Amazon GuardDuty in the MALZ.
D. Use AWS Managed Services (AMS) Accelerate to build a multi-account landing zone (MALZ). Submit an RFC to self-service provision AWS Security Hub in the MALZ.


Question # 4

A solutions architect is designing a user authentication solution for a company. The solution must invoke two-factor authentication for users that log in from inconsistent geographic locations, IP addresses, or devices. The solution must also be able to scale up to accommodate millions of users. Which solution will meet these requirements?

A. Configure Amazon Cognito user pools for user authentication. Enable the risk-based adaptive authentication feature with multi-factor authentication (MFA).
B. Configure Amazon Cognito identity pools for user authentication. Enable multi-factor authentication (MFA).
C. Configure AWS Identity and Access Management (IAM) users for user authentication. Attach an IAM policy that allows the AllowManageOwnUserMFA action.
D. Configure AWS IAM Identity Center (AWS Single Sign-On) authentication for user authentication. Configure the permission sets to require multi-factor authentication (MFA).


Question # 5

A solutions architect needs to design the architecture for an application that a vendor provides as a Docker container image. The container needs 50 GB of storage available for temporary files. The infrastructure must be serverless. Which solution meets these requirements with the LEAST operational overhead?

A. Create an AWS Lambda function that uses the Docker container image with an Amazon S3 mounted volume that has more than 50 GB of space.
B. Create an AWS Lambda function that uses the Docker container image with an Amazon Elastic Block Store (Amazon EBS) volume that has more than 50 GB of space.
C. Create an Amazon Elastic Container Service (Amazon ECS) cluster that uses the AWS Fargate launch type. Create a task definition for the container image with an Amazon Elastic File System (Amazon EFS) volume. Create a service with that task definition.
D. Create an Amazon Elastic Container Service (Amazon ECS) cluster that uses the Amazon EC2 launch type with an Amazon Elastic Block Store (Amazon EBS) volume that has more than 50 GB of space. Create a task definition for the container image. Create a service with that task definition.


Question # 6

A company uses AWS Organizations to run workloads within multiple AWS accounts. A tagging policy adds department tags to AWS resources when the company creates tags. An accounting team needs to determine spending on Amazon EC2 consumption. The accounting team must determine which departments are responsible for the costs regardless of AWS account. The accounting team has access to AWS Cost Explorer for all AWS accounts within the organization and needs to access all reports from Cost Explorer. Which solution meets these requirements in the MOST operationally efficient way?

A. From the Organizations management account billing console, activate a user-defined cost allocation tag named department. Create one cost report in Cost Explorer grouping by tag name, and filter by EC2.
B. From the Organizations management account billing console, activate an AWS-defined cost allocation tag named department. Create one cost report in Cost Explorer grouping by tag name, and filter by EC2.
C. From the Organizations member account billing console, activate a user-defined cost allocation tag named department. Create one cost report in Cost Explorer grouping by the tag name, and filter by EC2.
D. From the Organizations member account billing console, activate an AWS-defined cost allocation tag named department. Create one cost report in Cost Explorer grouping by tag name, and filter by EC2.


Question # 7

A company is building an Amazon Elastic Kubernetes Service (Amazon EKS) cluster for its workloads. All secrets that are stored in Amazon EKS must be encrypted in the Kubernetes etcd key-value store. Which solution will meet these requirements?

A. Create a new AWS Key Management Service (AWS KMS) key. Use AWS Secrets Manager to manage, rotate, and store all secrets in Amazon EKS.
B. Create a new AWS Key Management Service (AWS KMS) key. Enable Amazon EKS KMS secrets encryption on the Amazon EKS cluster.
C. Create the Amazon EKS cluster with default options. Use the Amazon Elastic Block Store (Amazon EBS) Container Storage Interface (CSI) driver as an add-on.
D. Create a new AWS Key Management Service (AWS KMS) key with the alias/aws/ebs alias. Enable default Amazon Elastic Block Store (Amazon EBS) volume encryption for the account.


Question # 8

A retail company has several businesses. The IT team for each business manages its own AWS account. Each team account is part of an organization in AWS Organizations. Each team monitors its product inventory levels in an Amazon DynamoDB table in the team's own AWS account. The company is deploying a central inventory reporting application into a shared AWS account. The application must be able to read items from all the teams' DynamoDB tables. Which authentication option will meet these requirements MOST securely?

A. Integrate DynamoDB with AWS Secrets Manager in the inventory application account. Configure the application to use the correct secret from Secrets Manager to authenticate and read the DynamoDB table. Schedule secret rotation for every 30 days.
B. In every business account, create an IAM user that has programmatic access. Configure the application to use the correct IAM user access key ID and secret access key to authenticate and read the DynamoDB table. Manually rotate IAM access keys every 30 days.
C. In every business account, create an IAM role named BU_ROLE with a policy that gives the role access to the DynamoDB table and a trust policy to trust a specific role in the inventory application account. In the inventory account, create a role named APP_ROLE that allows access to the STS AssumeRole API operation. Configure the application to use APP_ROLE and assume the cross-account role BU_ROLE to read the DynamoDB table.
D. Integrate DynamoDB with AWS Certificate Manager (ACM). Generate identity certificates to authenticate DynamoDB. Configure the application to use the correct certificate to authenticate and read the DynamoDB table.
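The cross-account pattern described in option C rests on two policy documents: a permissions policy on BU_ROLE allowing DynamoDB reads, and a trust policy on BU_ROLE naming the application role. A sketch of the trust policy half (the account ID and role ARN are placeholders, and nothing is sent to AWS):

```python
import json

def build_trust_policy(app_role_arn):
    """Trust policy for BU_ROLE: only the named application role
    may call sts:AssumeRole to obtain temporary credentials."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": {"AWS": app_role_arn},
                "Action": "sts:AssumeRole",
            }
        ],
    }

# Placeholder ARN for illustration only.
policy = build_trust_policy("arn:aws:iam::111111111111:role/APP_ROLE")
print(json.dumps(policy, indent=2))
```

Because access is granted through short-lived assumed-role credentials rather than long-lived access keys, nothing needs manual rotation, which is why this pattern is generally considered the most secure of the four.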


Question # 9

A company built an application with Docker containers and needs to run the application in the AWS Cloud. The company wants to use a managed service to host the application. The solution must scale in and out appropriately according to demand on the individual container services. The solution also must not result in additional operational overhead or infrastructure to manage. Which solutions will meet these requirements? (Select TWO)

A. Use Amazon Elastic Container Service (Amazon ECS) with AWS Fargate.
B. Use Amazon Elastic Kubernetes Service (Amazon EKS) with AWS Fargate.
C. Provision an Amazon API Gateway API. Connect the API to AWS Lambda to run the containers.
D. Use Amazon Elastic Container Service (Amazon ECS) with Amazon EC2 worker nodes.


Question # 10

A company uses Amazon S3 as its data lake. The company has a new partner that must use SFTP to upload data files. A solutions architect needs to implement a highly available SFTP solution that minimizes operational overhead. Which solution will meet these requirements?

A. Use AWS Transfer Family to configure an SFTP-enabled server with a publicly accessible endpoint. Choose the S3 data lake as the destination.
B. Use Amazon S3 File Gateway as an SFTP server. Expose the S3 File Gateway endpoint URL to the new partner. Share the S3 File Gateway endpoint with the new partner.
C. Launch an Amazon EC2 instance in a private subnet in a VPC. Instruct the new partner to upload files to the EC2 instance by using a VPN. Run a cron job script on the EC2 instance to upload files to the S3 data lake.
D. Launch Amazon EC2 instances in a private subnet in a VPC. Place a Network Load Balancer (NLB) in front of the EC2 instances. Create an SFTP listener port for the NLB. Share the NLB hostname with the new partner. Run a cron job script on the EC2 instances to upload files to the S3 data lake.


Question # 11

A company hosts an application used to upload files to an Amazon S3 bucket. Once uploaded, the files are processed to extract metadata, which takes less than 5 seconds. The volume and frequency of the uploads varies from a few files each hour to hundreds of concurrent uploads. The company has asked a solutions architect to design a cost-effective architecture that will meet these requirements. What should the solutions architect recommend?

A. Configure AWS CloudTrail trails to log S3 API calls. Use AWS AppSync to process the files.
B. Configure an object-created event notification within the S3 bucket to invoke an AWS Lambda function to process the files.
C. Configure Amazon Kinesis Data Streams to process and send data to Amazon S3. Invoke an AWS Lambda function to process the files.
D. Configure an Amazon Simple Notification Service (Amazon SNS) topic to process the files uploaded to Amazon S3. Invoke an AWS Lambda function to process the files.
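The event-driven shape in option B is easy to picture: S3 delivers an ObjectCreated notification whose Records carry the bucket and key, and the Lambda handler pulls them out before extracting metadata. A minimal handler sketch (the event shape follows the documented S3 notification format; the function body is illustrative):

```python
def handler(event, context=None):
    """Extract (bucket, key) pairs from an S3 ObjectCreated event so the
    metadata-extraction step knows which uploaded file to read."""
    files = []
    for record in event.get("Records", []):
        s3 = record["s3"]
        files.append((s3["bucket"]["name"], s3["object"]["key"]))
    return files
```

Because Lambda scales per event, the same code handles a few files an hour or hundreds of concurrent uploads with no capacity to manage, which is what makes this design cost-effective for such a spiky workload.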


Question # 12

A company runs analytics software on Amazon EC2 instances. The software accepts job requests from users to process data that has been uploaded to Amazon S3. Users report that some submitted data is not being processed. Amazon CloudWatch reveals that the EC2 instances have a consistent CPU utilization at or near 100%. The company wants to improve system performance and scale the system based on user load. What should a solutions architect do to meet these requirements?

A. Create a copy of the instance. Place all instances behind an Application Load Balancer.
B. Create an S3 VPC endpoint for Amazon S3. Update the software to reference the endpoint.
C. Stop the EC2 instances. Modify the instance type to one with a more powerful CPU and more memory. Restart the instances.
D. Route incoming requests to Amazon Simple Queue Service (Amazon SQS). Configure an EC2 Auto Scaling group based on queue size. Update the software to read from the queue.
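The queue-based design in option D decouples request intake from processing, and the Auto Scaling side is typically driven by backlog per instance: divide the queue depth by the number of messages one instance can drain in the scaling window. A sketch of that capacity calculation (the per-instance throughput and group limits are illustrative assumptions):

```python
import math

def desired_instances(queue_depth, msgs_per_instance, min_size=1, max_size=20):
    """Target Auto Scaling capacity from SQS backlog: enough instances to
    work through the queue at the assumed per-instance throughput,
    clamped to the group's min/max size."""
    needed = math.ceil(queue_depth / msgs_per_instance) if queue_depth else min_size
    return max(min_size, min(max_size, needed))
```

With this shape, a burst of submissions simply deepens the queue and scales the fleet out, instead of pinning a fixed set of instances at 100% CPU and dropping work.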


Question # 13

A company is deploying an application that processes streaming data in near-real time. The company plans to use Amazon EC2 instances for the workload. The network architecture must be configurable to provide the lowest possible latency between nodes. Which combination of network solutions will meet these requirements? (Select TWO)

A. Enable and configure enhanced networking on each EC2 instance
B. Group the EC2 instances in separate accounts
C. Run the EC2 instances in a cluster placement group
D. Attach multiple elastic network interfaces to each EC2 instance
E. Use Amazon Elastic Block Store (Amazon EBS) optimized instance types.


Question # 14

A company runs a container application on a Kubernetes cluster in the company's data center. The application uses Advanced Message Queuing Protocol (AMQP) to communicate with a message queue. The data center cannot scale fast enough to meet the company's expanding business needs. The company wants to migrate the workloads to AWS. Which solution will meet these requirements with the LEAST operational overhead?

A. Migrate the container application to Amazon Elastic Container Service (Amazon ECS). Use Amazon Simple Queue Service (Amazon SQS) to retrieve the messages.
B. Migrate the container application to Amazon Elastic Kubernetes Service (Amazon EKS). Use Amazon MQ to retrieve the messages.
C. Use highly available Amazon EC2 instances to run the application. Use Amazon MQ to retrieve the messages.
D. Use AWS Lambda functions to run the application. Use Amazon Simple Queue Service (Amazon SQS) to retrieve the messages.


Question # 15

A company runs a real-time data ingestion solution on AWS. The solution consists of the most recent version of Amazon Managed Streaming for Apache Kafka (Amazon MSK). The solution is deployed in a VPC in private subnets across three Availability Zones. A solutions architect needs to redesign the data ingestion solution to be publicly available over the internet. The data in transit must also be encrypted. Which solution will meet these requirements with the MOST operational efficiency?

A. Configure public subnets in the existing VPC. Deploy an MSK cluster in the public subnets. Update the MSK cluster security settings to enable mutual TLS authentication.
B. Create a new VPC that has public subnets. Deploy an MSK cluster in the public subnets. Update the MSK cluster security settings to enable mutual TLS authentication.
C. Deploy an Application Load Balancer (ALB) that uses private subnets. Configure an ALB security group inbound rule to allow inbound traffic from the VPC CIDR block for the HTTPS protocol.
D. Deploy a Network Load Balancer (NLB) that uses private subnets. Configure an NLB listener for HTTPS communication over the internet.


Question # 16

A company runs a Java-based job on an Amazon EC2 instance. The job runs every hour and takes 10 seconds to run. The job runs on a scheduled interval and consumes 1 GB of memory. The CPU utilization of the instance is low except for short surges during which the job uses the maximum CPU available. The company wants to optimize the costs to run the job. Which solution will meet these requirements?

A. Use AWS App2Container (A2C) to containerize the job. Run the job as an Amazon Elastic Container Service (Amazon ECS) task on AWS Fargate with 0.5 virtual CPU (vCPU) and 1 GB of memory.
B. Copy the code into an AWS Lambda function that has 1 GB of memory. Create an Amazon EventBridge scheduled rule to run the code each hour.
C. Use AWS App2Container (A2C) to containerize the job. Install the container in the existing Amazon Machine Image (AMI). Ensure that the schedule stops the container when the task finishes.
D. Configure the existing schedule to stop the EC2 instance at the completion of the job and restart the EC2 instance when the next job starts.
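The economics behind option B are easy to check: an hourly 10-second, 1 GB job is roughly 730 invocations a month, and Lambda bills duration in GB-seconds. A back-of-the-envelope sketch (the per-GB-second price below is an illustrative, assumed rate, not a quote, and the tiny per-request fee is ignored):

```python
PRICE_PER_GB_SECOND = 0.0000166667  # assumed illustrative Lambda rate

def monthly_lambda_cost(runs_per_month, duration_s, memory_gb):
    """Approximate monthly Lambda duration cost in dollars."""
    gb_seconds = runs_per_month * duration_s * memory_gb
    return gb_seconds * PRICE_PER_GB_SECOND

# ~730 hourly runs/month x 10 s x 1 GB = 7300 GB-seconds
cost = monthly_lambda_cost(730, 10, 1.0)
```

At these assumptions the job costs on the order of twelve cents a month before any free tier, versus paying for an EC2 instance that sits mostly idle between hourly surges.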


Question # 17

An ecommerce company runs applications in AWS accounts that are part of an organization in AWS Organizations. The applications run on Amazon Aurora PostgreSQL databases across all the accounts. The company needs to prevent malicious activity and must identify abnormal failed and incomplete login attempts to the databases. Which solution will meet these requirements in the MOST operationally efficient way?

A. Attach service control policies (SCPs) to the root of the organization to identify the failed login attempts.
B. Enable the Amazon RDS Protection feature in Amazon GuardDuty for the member accounts of the organization.
C. Publish the Aurora general logs to a log group in Amazon CloudWatch Logs. Export the log data to a central Amazon S3 bucket.
D. Publish all the Aurora PostgreSQL database events in AWS CloudTrail to a central Amazon S3 bucket.


Question # 18

A company needs to provide customers with secure access to its data. The company processes customer data and stores the results in an Amazon S3 bucket. All the data is subject to strong regulations and security requirements. The data must be encrypted at rest. Each customer must be able to access only their data from their AWS account. Company employees must not be able to access the data. Which solution will meet these requirements?

A. Provision an AWS Certificate Manager (ACM) certificate for each customer. Encrypt the data client-side. In the private certificate policy, deny access to the certificate for all principals except an IAM role that the customer provides.
B. Provision a separate AWS Key Management Service (AWS KMS) key for each customer. Encrypt the data server-side. In the S3 bucket policy, deny decryption of data for all principals except an IAM role that the customer provides.
C. Provision a separate AWS Key Management Service (AWS KMS) key for each customer. Encrypt the data server-side. In each KMS key policy, deny decryption of data for all principals except an IAM role that the customer provides.
D. Provision an AWS Certificate Manager (ACM) certificate for each customer. Encrypt the data client-side. In the public certificate policy, deny access to the certificate for all principals except an IAM role that the customer provides.
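The per-customer isolation in option C works because KMS key policies are default-deny: a principal with S3 read access still cannot decrypt the objects without kms:Decrypt on that customer's key. A sketch of such a key policy (account ID and role ARN are placeholders; a real policy would also scope the admin statement more tightly):

```python
def customer_key_policy(customer_role_arn, account_id):
    """Key policy for one customer's KMS key: the account root keeps key
    administration, while only the customer-provided role may use the key
    for decryption. Everything else is implicitly denied."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "KeyAdministration",
                "Effect": "Allow",
                "Principal": {"AWS": f"arn:aws:iam::{account_id}:root"},
                "Action": "kms:*",
                "Resource": "*",
            },
            {
                "Sid": "CustomerUseOnly",
                "Effect": "Allow",
                "Principal": {"AWS": customer_role_arn},
                "Action": ["kms:Decrypt", "kms:GenerateDataKey"],
                "Resource": "*",
            },
        ],
    }
```

One key and one such policy per customer gives each customer a cryptographic boundary around their own data, independent of the shared S3 bucket policy.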


Question # 19

A company has a nightly batch processing routine that analyzes report files that an on-premises file system receives daily through SFTP. The company wants to move the solution to the AWS Cloud. The solution must be highly available and resilient. The solution also must minimize operational effort. Which solution meets these requirements?

A. Deploy AWS Transfer for SFTP and an Amazon Elastic File System (Amazon EFS) file system for storage. Use an Amazon EC2 instance in an Auto Scaling group with a scheduled scaling policy to run the batch operation.
B. Deploy an Amazon EC2 instance that runs Linux and an SFTP service. Use an Amazon Elastic Block Store (Amazon EBS) volume for storage. Use an Auto Scaling group with the minimum number of instances and desired number of instances set to 1.
C. Deploy an Amazon EC2 instance that runs Linux and an SFTP service. Use an Amazon Elastic File System (Amazon EFS) file system for storage. Use an Auto Scaling group with the minimum number of instances and desired number of instances set to 1.
D. Deploy AWS Transfer for SFTP and an Amazon S3 bucket for storage. Modify the application to pull the batch files from Amazon S3 to an Amazon EC2 instance for processing. Use an EC2 instance in an Auto Scaling group with a scheduled scaling policy to run the batch operation.


Question # 20

A company uses high-concurrency AWS Lambda functions to process a constantly increasing number of messages in a message queue during marketing events. The Lambda functions use CPU-intensive code to process the messages. The company wants to reduce the compute costs and to maintain service latency for its customers. Which solution will meet these requirements?

A. Configure reserved concurrency for the Lambda functions. Decrease the memory allocated to the Lambda functions.
B. Configure reserved concurrency for the Lambda functions. Increase the memory according to AWS Compute Optimizer recommendations.
C. Configure provisioned concurrency for the Lambda functions. Decrease the memory allocated to the Lambda functions.
D. Configure provisioned concurrency for the Lambda functions. Increase the memory according to AWS Compute Optimizer recommendations.


Question # 21

A company runs applications on AWS that connect to the company's Amazon RDS database. The applications scale on weekends and at peak times of the year. The company wants to scale the database more effectively for its applications that connect to the database. Which solution will meet these requirements with the LEAST operational overhead?

A. Use Amazon DynamoDB with connection pooling with a target group configuration for the database. Change the applications to use the DynamoDB endpoint.
B. Use Amazon RDS Proxy with a target group for the database. Change the applications to use the RDS Proxy endpoint.
C. Use a custom proxy that runs on Amazon EC2 as an intermediary to the database. Change the applications to use the custom proxy endpoint.
D. Use an AWS Lambda function to provide connection pooling with a target group configuration for the database. Change the applications to use the Lambda function.


Question # 22

A company wants to run its payment application on AWS. The application receives payment notifications from mobile devices. Payment notifications require a basic validation before they are sent for further processing. The backend processing application is long running and requires compute and memory to be adjusted. The company does not want to manage the infrastructure. Which solution will meet these requirements with the LEAST operational overhead?

A. Create an Amazon Simple Queue Service (Amazon SQS) queue. Integrate the queue with an Amazon EventBridge rule to receive payment notifications from mobile devices. Configure the rule to validate payment notifications and send the notifications to the backend application. Deploy the backend application on Amazon Elastic Kubernetes Service (Amazon EKS) Anywhere. Create a standalone cluster.
B. Create an Amazon API Gateway API. Integrate the API with an AWS Step Functions state machine to receive payment notifications from mobile devices. Invoke the state machine to validate payment notifications and send the notifications to the backend application. Deploy the backend application on Amazon Elastic Kubernetes Service (Amazon EKS). Configure an EKS cluster with self-managed nodes.
C. Create an Amazon Simple Queue Service (Amazon SQS) queue. Integrate the queue with an Amazon EventBridge rule to receive payment notifications from mobile devices. Configure the rule to validate payment notifications and send the notifications to the backend application. Deploy the backend application on Amazon EC2 Spot Instances. Configure a Spot Fleet with a default allocation strategy.
D. Create an Amazon API Gateway API. Integrate the API with AWS Lambda to receive payment notifications from mobile devices. Invoke a Lambda function to validate payment notifications and send the notifications to the backend application. Deploy the backend application on Amazon Elastic Container Service (Amazon ECS). Configure Amazon ECS with an AWS Fargate launch type.


Question # 23

A company has multiple AWS accounts with applications deployed in the us-west-2 Region. Application logs are stored within Amazon S3 buckets in each account. The company wants to build a centralized log analysis solution that uses a single S3 bucket. Logs must not leave us-west-2, and the company wants to incur minimal operational overhead. Which solution meets these requirements and is MOST cost-effective?

A. Create an S3 Lifecycle policy that copies the objects from one of the application S3 buckets to the centralized S3 bucket.
B. Use S3 Same-Region Replication to replicate logs from the S3 buckets to another S3 bucket in us-west-2. Use this S3 bucket for log analysis.
C. Write a script that uses the PutObject API operation every day to copy the entire contents of the buckets to another S3 bucket in us-west-2. Use this S3 bucket for log analysis.
D. Write AWS Lambda functions in these accounts that are triggered every time logs are delivered to the S3 buckets (s3:ObjectCreated:* event). Copy the logs to another S3 bucket in us-west-2. Use this S3 bucket for log analysis.


Question # 24

A company runs a highly available web application on Amazon EC2 instances behind an Application Load Balancer. The company uses Amazon CloudWatch metrics. As the traffic to the web application increases, some EC2 instances become overloaded with many outstanding requests. The CloudWatch metrics show that the number of requests processed and the time to receive the responses from some EC2 instances are both higher compared to other EC2 instances. The company does not want new requests to be forwarded to the EC2 instances that are already overloaded. Which solution will meet these requirements?

A. Use the round robin routing algorithm based on the RequestCountPerTarget and ActiveConnectionCount CloudWatch metrics.
B. Use the least outstanding requests algorithm based on the RequestCountPerTarget and ActiveConnectionCount CloudWatch metrics.
C. Use the round robin routing algorithm based on the RequestCount and TargetResponseTime CloudWatch metrics.
D. Use the least outstanding requests algorithm based on the RequestCount and TargetResponseTime CloudWatch metrics.


Question # 25

An analytics company uses Amazon VPC to run its multi-tier services. The company wants to use RESTful APIs to offer a web analytics service to millions of users. Users must be verified by using an authentication service to access the APIs. Which solution will meet these requirements with the MOST operational efficiency?

A. Configure an Amazon Cognito user pool for user authentication. Implement Amazon API Gateway REST APIs with a Cognito authorizer.
B. Configure an Amazon Cognito identity pool for user authentication. Implement Amazon API Gateway HTTP APIs with a Cognito authorizer.
C. Configure an AWS Lambda function to handle user authentication. Implement Amazon API Gateway REST APIs with a Lambda authorizer.
D. Configure an IAM user to handle user authentication. Implement Amazon API Gateway HTTP APIs with an IAM authorizer.


Question # 26

A company has an AWS Direct Connect connection from its on-premises location to an AWS account. The AWS account has 30 different VPCs in the same AWS Region. The VPCs use private virtual interfaces (VIFs). Each VPC has a CIDR block that does not overlap with other networks under the company's control. The company wants to centrally manage the networking architecture while still allowing each VPC to communicate with all other VPCs and on-premises networks. Which solution will meet these requirements with the LEAST amount of operational overhead?

A. Create a transit gateway and associate the Direct Connect connection with a new transit VIF. Turn on the transit gateway's route propagation feature.
B. Create a Direct Connect gateway. Recreate the private VIFs to use the new gateway. Associate each VPC by creating new virtual private gateways.
C. Create a transit VPC. Connect the Direct Connect connection to the transit VPC. Create a peering connection between all other VPCs in the Region. Update the route tables.
D. Create AWS Site-to-Site VPN connections from on premises to each VPC. Ensure that both VPN tunnels are UP for each connection. Turn on the route propagation feature.


Question # 27

A solutions architect is designing a shared storage solution for a web application that is deployed across multiple Availability Zones. The web application runs on Amazon EC2 instances that are in an Auto Scaling group. The company plans to make frequent changes to the content. The solution must have strong consistency in returning the new content as soon as the changes occur. Which solutions meet these requirements? (Select TWO.)

A. Use AWS Storage Gateway Volume Gateway Internet Small Computer Systems Interface (iSCSI) block storage that is mounted to the individual EC2 instances.
B. Create an Amazon Elastic File System (Amazon EFS) file system. Mount the EFS file system on the individual EC2 instances.
C. Create a shared Amazon Elastic Block Store (Amazon EBS) volume. Mount the EBS volume on the individual EC2 instances.
D. Use AWS DataSync to perform continuous synchronization of data between EC2 hosts in the Auto Scaling group.
E. Create an Amazon S3 bucket to store the web content. Set the metadata for the Cache-Control header to no-cache. Use Amazon CloudFront to deliver the content.


Question # 28

A company needs to extract the names of ingredients from recipe records that are stored as text files in an Amazon S3 bucket. A web application will use the ingredient names to query an Amazon DynamoDB table and determine a nutrition score. The application can handle non-food records and errors. The company does not have any employees who have machine learning knowledge to develop this solution. Which solution will meet these requirements MOST cost-effectively?

A. Use S3 Event Notifications to invoke an AWS Lambda function when PutObject requests occur. Program the Lambda function to analyze the object and extract the ingredient names by using Amazon Comprehend. Store the Amazon Comprehend output in the DynamoDB table.
B. Use an Amazon EventBridge rule to invoke an AWS Lambda function when PutObject requests occur. Program the Lambda function to analyze the object by using Amazon Forecast to extract the ingredient names. Store the Forecast output in the DynamoDB table.
C. Use S3 Event Notifications to invoke an AWS Lambda function when PutObject requests occur. Use Amazon Polly to create audio recordings of the recipe records. Save the audio files in the S3 bucket. Use Amazon Simple Notification Service (Amazon SNS) to send a URL as a message to employees. Instruct the employees to listen to the audio files and calculate the nutrition score. Store the ingredient names in the DynamoDB table.
D. Use an Amazon EventBridge rule to invoke an AWS Lambda function when a PutObject request occurs. Program the Lambda function to analyze the object and extract the ingredient names by using Amazon SageMaker. Store the inference output from the SageMaker endpoint in the DynamoDB table.


Question # 29

A company has a new mobile app. Anywhere in the world, users can see local news on topics they choose. Users also can post photos and videos from inside the app. Users access content often in the first minutes after the content is posted. New content quickly replaces older content, and then the older content disappears. The local nature of the news means that users consume 90% of the content within the AWS Region where it is uploaded. Which solution will optimize the user experience by providing the LOWEST latency for content uploads?

A. Upload and store content in Amazon S3. Use Amazon CloudFront for the uploads.
B. Upload and store content in Amazon S3. Use S3 Transfer Acceleration for the uploads.
C. Upload content to Amazon EC2 instances in the Region that is closest to the user. Copy the data to Amazon S3.
D. Upload and store content in Amazon S3 in the Region that is closest to the user. Use multiple distributions of Amazon CloudFront.


Question # 30

An ecommerce application uses a PostgreSQL database that runs on an Amazon EC2 instance. During a monthly sales event, database usage increases and causes database connection issues for the application. The traffic is unpredictable for subsequent monthly sales events, which impacts the sales forecast. The company needs to maintain performance when there is an unpredictable increase in traffic. Which solution resolves this issue in the MOST cost-effective way?

A. Migrate the PostgreSQL database to Amazon Aurora Serverless v2.
B. Enable auto scaling for the PostgreSQL database on the EC2 instance to accommodate increased usage.
C. Migrate the PostgreSQL database to Amazon RDS for PostgreSQL with a larger instance type.
D. Migrate the PostgreSQL database to Amazon Redshift to accommodate increased usage.


Question # 31

A company's marketing data is uploaded from multiple sources to an Amazon S3 bucket. A series of data preparation jobs aggregate the data for reporting. The data preparation jobs need to run at regular intervals in parallel. A few jobs need to run in a specific order later. The company wants to remove the operational overhead of job error handling, retry logic, and state management. Which solution will meet these requirements?

A. Use an AWS Lambda function to process the data as soon as the data is uploaded to the S3 bucket. Invoke other Lambda functions at regularly scheduled intervals.
B. Use Amazon Athena to process the data. Use Amazon EventBridge Scheduler to invoke Athena on a regular interval.
C. Use AWS Glue DataBrew to process the data. Use an AWS Step Functions state machine to run the DataBrew data preparation jobs.
D. Use AWS Data Pipeline to process the data. Schedule Data Pipeline to process the data once at midnight.


Question # 32

A research company uses on-premises devices to generate data for analysis. The company wants to use the AWS Cloud to analyze the data. The devices generate .csv files and support writing the data to an SMB file share. Company analysts must be able to use SQL commands to query the data. The analysts will run queries periodically throughout the day. Which combination of steps will meet these requirements MOST cost-effectively? (Select THREE.)

A. Deploy an AWS Storage Gateway on premises in Amazon S3 File Gateway mode.
B. Deploy an AWS Storage Gateway on premises in Amazon FSx File Gateway mode.
C. Set up an AWS Glue crawler to create a table based on the data that is in Amazon S3.
D. Set up an Amazon EMR cluster with EMR File System (EMRFS) to query the data that is in Amazon S3. Provide access to analysts.
E. Set up an Amazon Redshift cluster to query the data that is in Amazon S3. Provide access to analysts.
F. Set up Amazon Athena to query the data that is in Amazon S3. Provide access to analysts.


Question # 33

A company website hosted on Amazon EC2 instances processes classified data. The application writes data to Amazon Elastic Block Store (Amazon EBS) volumes. The company needs to ensure that all data that is written to the EBS volumes is encrypted at rest. Which solution will meet this requirement?

A. Create an IAM role that specifies EBS encryption. Attach the role to the EC2 instances.
B. Create the EBS volumes as encrypted volumes. Attach the EBS volumes to the EC2 instances.
C. Create an EC2 instance tag that has a key of Encrypt and a value of True. Tag all instances that require encryption at the EBS level.
D. Create an AWS Key Management Service (AWS KMS) key policy that enforces EBS encryption in the account. Ensure that the key policy is active.
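The mechanics behind option B are that EBS encryption at rest is declared when the volume is created. A minimal sketch of the request, assuming placeholder values for the Availability Zone, size, and volume type:

```python
# Illustrative only: EBS encryption at rest is set at volume creation time.
# Availability Zone, size, and volume type below are placeholders.
volume_params = {
    "AvailabilityZone": "us-east-1a",  # hypothetical AZ
    "Size": 100,                       # GiB, hypothetical
    "VolumeType": "gp3",
    "Encrypted": True,                 # encrypts data at rest on the volume
    # "KmsKeyId": "...",               # optional: a customer managed KMS key
}

# With boto3: boto3.client("ec2").create_volume(**volume_params)
```

Snapshots of an encrypted volume, and volumes restored from those snapshots, remain encrypted as well.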


Question # 34

A company has Amazon EC2 instances that run nightly batch jobs to process data. The EC2 instances run in an Auto Scaling group that uses On-Demand billing. If a job fails on one instance, another instance will reprocess the job. The batch jobs run between 12:00 AM and 6:00 AM local time every day. Which solution will provide EC2 instances to meet these requirements MOST cost-effectively?

A. Purchase a 1-year Savings Plan for Amazon EC2 that covers the instance family of the Auto Scaling group that the batch job uses.
B. Purchase a 1-year Reserved Instance for the specific instance type and operating system of the instances in the Auto Scaling group that the batch job uses.
C. Create a new launch template for the Auto Scaling group. Set the instances to Spot Instances. Set a policy to scale out based on CPU usage.
D. Create a new launch template for the Auto Scaling group. Increase the instance size. Set a policy to scale out based on CPU usage.


Question # 35

A company hosts a three-tier web application in the AWS Cloud. A Multi-AZ Amazon RDS for MySQL server forms the database layer. Amazon ElastiCache forms the cache layer. The company wants a caching strategy that adds or updates data in the cache when a customer adds an item to the database. The data in the cache must always match the data in the database. Which solution will meet these requirements?

A. Implement the lazy loading caching strategy.
B. Implement the write-through caching strategy.
C. Implement the adding TTL caching strategy.
D. Implement the AWS AppConfig caching strategy.


Question # 36

A company wants to analyze and troubleshoot Access Denied errors and Unauthorized errors that are related to IAM permissions. The company has AWS CloudTrail turned on. Which solution will meet these requirements with the LEAST effort?

A. Use AWS Glue and write custom scripts to query CloudTrail logs for the errors.
B. Use AWS Batch and write custom scripts to query CloudTrail logs for the errors.
C. Search CloudTrail logs with Amazon Athena queries to identify the errors.
D. Search CloudTrail logs with Amazon QuickSight. Create a dashboard to identify the errors.
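To make option C concrete, a query over a CloudTrail table in Athena might look like the sketch below. The table name "cloudtrail_logs" and the results bucket are assumptions; an actual setup would use the table created for the trail's S3 location.

```python
# Illustrative only: an Athena SQL query over CloudTrail logs to surface
# Access Denied and Unauthorized errors. Table name is a placeholder.
query = """
SELECT eventtime, useridentity.arn AS principal, eventname, errorcode
FROM cloudtrail_logs
WHERE errorcode IN ('AccessDenied', 'UnauthorizedOperation')
ORDER BY eventtime DESC
LIMIT 100
""".strip()

# With boto3, the query could be submitted as:
#   boto3.client("athena").start_query_execution(
#       QueryString=query,
#       ResultConfiguration={"OutputLocation": "s3://my-athena-results/"})
```

No servers or custom scripts are needed, which is what makes Athena the least-effort choice here.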


Question # 37

A global company runs its applications in multiple AWS accounts in AWS Organizations. The company's applications use multipart uploads to upload data to multiple Amazon S3 buckets across AWS Regions. The company wants to report on incomplete multipart uploads for cost compliance purposes. Which solution will meet these requirements with the LEAST operational overhead?

A. Configure AWS Config with a rule to report the incomplete multipart upload object count.
B. Create a service control policy (SCP) to report the incomplete multipart upload object count.
C. Configure S3 Storage Lens to report the incomplete multipart upload object count.
D. Create an S3 Multi-Region Access Point to report the incomplete multipart upload object count.


Question # 38

A company has stored 10 TB of log files in Apache Parquet format in an Amazon S3 bucket. The company occasionally needs to use SQL to analyze the log files. Which solution will meet these requirements MOST cost-effectively?

A. Create an Amazon Aurora MySQL database. Migrate the data from the S3 bucket into Aurora by using AWS Database Migration Service (AWS DMS). Issue SQL statements to the Aurora database.
B. Create an Amazon Redshift cluster. Use Redshift Spectrum to run SQL statements directly on the data in the S3 bucket.
C. Create an AWS Glue crawler to store and retrieve table metadata from the S3 bucket. Use Amazon Athena to run SQL statements directly on the data in the S3 bucket.
D. Create an Amazon EMR cluster. Use Apache Spark SQL to run SQL statements directly on the data in the S3 bucket.


Question # 39

A pharmaceutical company is developing a new drug. The volume of data that the company generates has grown exponentially over the past few months. The company's researchers regularly require a subset of the entire dataset to be immediately available with minimal lag. However, the entire dataset does not need to be accessed on a daily basis. All the data currently resides in on-premises storage arrays, and the company wants to reduce ongoing capital expenses. Which storage solution should a solutions architect recommend to meet these requirements?

A. Run AWS DataSync as a scheduled cron job to migrate the data to an Amazon S3 bucket on an ongoing basis.
B. Deploy an AWS Storage Gateway file gateway with an Amazon S3 bucket as the target storage. Migrate the data to the Storage Gateway appliance.
C. Deploy an AWS Storage Gateway volume gateway with cached volumes with an Amazon S3 bucket as the target storage. Migrate the data to the Storage Gateway appliance.
D. Configure an AWS Site-to-Site VPN connection from the on-premises environment to AWS. Migrate data to an Amazon Elastic File System (Amazon EFS) file system.


Question # 40

A company runs a three-tier web application in a VPC across multiple Availability Zones. Amazon EC2 instances run in an Auto Scaling group for the application tier. The company needs to make an automated scaling plan that will analyze each resource's daily and weekly historical workload trends. The configuration must scale resources appropriately according to both the forecast and live changes in utilization. Which scaling strategy should a solutions architect recommend to meet these requirements?

A. Implement dynamic scaling with step scaling based on average CPU utilization from the EC2 instances.
B. Enable predictive scaling to forecast and scale. Configure dynamic scaling with target tracking.
C. Create an automated scheduled scaling action based on the traffic patterns of the web application.
D. Set up a simple scaling policy. Increase the cooldown period based on the EC2 instance startup time.


Question # 41

A company deployed a serverless application that uses Amazon DynamoDB as a database layer. The application has experienced a large increase in users. The company wants to improve database response time from milliseconds to microseconds and to cache requests to the database. Which solution will meet these requirements with the LEAST operational overhead?

A. Use DynamoDB Accelerator (DAX).
B. Migrate the database to Amazon Redshift.
C. Migrate the database to Amazon RDS.
D. Use Amazon ElastiCache for Redis.


Question # 42

An online video game company must maintain ultra-low latency for its game servers. The game servers run on Amazon EC2 instances. The company needs a solution that can handle millions of UDP internet traffic requests each second. Which solution will meet these requirements MOST cost-effectively?

A. Configure an Application Load Balancer with the required protocol and ports for the internet traffic. Specify the EC2 instances as the targets.
B. Configure a Gateway Load Balancer for the internet traffic. Specify the EC2 instances as the targets.
C. Configure a Network Load Balancer with the required protocol and ports for the internet traffic. Specify the EC2 instances as the targets.
D. Launch an identical set of game servers on EC2 instances in separate AWS Regions. Route internet traffic to both sets of EC2 instances.


Question # 43

A company maintains an Amazon RDS database that maps users to cost centers. The company has accounts in an organization in AWS Organizations. The company needs a solution that will tag all resources that are created in a specific AWS account in the organization. The solution must tag each resource with the cost center ID of the user who created the resource. Which solution will meet these requirements?

A. Move the specific AWS account to a new organizational unit (OU) in Organizations from the management account. Create a service control policy (SCP) that requires all existing resources to have the correct cost center tag before the resources are created. Apply the SCP to the new OU.
B. Create an AWS Lambda function to tag the resources after the Lambda function looks up the appropriate cost center from the RDS database. Configure an Amazon EventBridge rule that reacts to AWS CloudTrail events to invoke the Lambda function.
C. Create an AWS CloudFormation stack to deploy an AWS Lambda function. Configure the Lambda function to look up the appropriate cost center from the RDS database and to tag resources. Create an Amazon EventBridge scheduled rule to invoke the CloudFormation stack.
D. Create an AWS Lambda function to tag the resources with a default value. Configure an Amazon EventBridge rule that reacts to AWS CloudTrail events to invoke the Lambda function when a resource is missing the cost center tag.


Question # 44

A company is designing a tightly coupled high performance computing (HPC) environment in the AWS Cloud. The company needs to include features that will optimize the HPC environment for networking and storage. Which combination of solutions will meet these requirements? (Select TWO.)

A. Create an accelerator in AWS Global Accelerator. Configure custom routing for the accelerator.
B. Create an Amazon FSx for Lustre file system. Configure the file system with scratch storage.
C. Create an Amazon CloudFront distribution. Configure the viewer protocol policy to be HTTP and HTTPS.
D. Launch Amazon EC2 instances. Attach an Elastic Fabric Adapter (EFA) to the instances.
E. Create an AWS Elastic Beanstalk deployment to manage the environment.


Question # 45

A company is running a photo hosting service in the us-east-1 Region. The service enables users across multiple countries to upload and view photos. Some photos are heavily viewed for months, and others are viewed for less than a week. The application allows uploads of up to 20 MB for each photo. The service uses the photo metadata to determine which photos to display to each user. Which solution provides the appropriate user access MOST cost-effectively?

A. Store the photos in Amazon DynamoDB. Turn on DynamoDB Accelerator (DAX) to cache frequently viewed items.
B. Store the photos in the Amazon S3 Intelligent-Tiering storage class. Store the photo metadata and its S3 location in DynamoDB.
C. Store the photos in the Amazon S3 Standard storage class. Set up an S3 Lifecycle policy to move photos older than 30 days to the S3 Standard-Infrequent Access (S3 Standard-IA) storage class. Use the object tags to keep track of metadata.
D. Store the photos in the Amazon S3 Glacier storage class. Set up an S3 Lifecycle policy to move photos older than 30 days to the S3 Glacier Deep Archive storage class. Store the photo metadata and its S3 location in Amazon OpenSearch Service.


Question # 46

A company is designing a new web application that will run on Amazon EC2 instances. The application will use Amazon DynamoDB for backend data storage. The application traffic will be unpredictable. The company expects that the application read and write throughput to the database will be moderate to high. The company needs to scale in response to application traffic. Which DynamoDB table configuration will meet these requirements MOST cost-effectively?

A. Configure DynamoDB with provisioned read and write by using the DynamoDB Standard table class. Set DynamoDB auto scaling to a maximum defined capacity.
B. Configure DynamoDB in on-demand mode by using the DynamoDB Standard table class.
C. Configure DynamoDB with provisioned read and write by using the DynamoDB Standard-Infrequent Access (DynamoDB Standard-IA) table class. Set DynamoDB auto scaling to a maximum defined capacity.
D. Configure DynamoDB in on-demand mode by using the DynamoDB Standard-Infrequent Access (DynamoDB Standard-IA) table class.
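For reference, the on-demand configuration in option B is a single billing-mode setting at table creation. The sketch below uses placeholder table and key names:

```python
# Illustrative only: a DynamoDB table in on-demand mode with the Standard
# table class. The table name and key schema are placeholders.
table_params = {
    "TableName": "AppData",
    "AttributeDefinitions": [{"AttributeName": "pk", "AttributeType": "S"}],
    "KeySchema": [{"AttributeName": "pk", "KeyType": "HASH"}],
    "BillingMode": "PAY_PER_REQUEST",  # on-demand: no capacity planning needed
    "TableClass": "STANDARD",
}

# With boto3: boto3.client("dynamodb").create_table(**table_params)
```

With PAY_PER_REQUEST, the table scales instantly with unpredictable traffic and the company pays only for the requests served.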


Question # 47

A company's web application that is hosted in the AWS Cloud recently increased in popularity. The web application currently exists on a single Amazon EC2 instance in a single public subnet. The web application has not been able to meet the demand of the increased web traffic. The company needs a solution that will provide high availability and scalability to meet the increased user demand without rewriting the web application. Which combination of steps will meet these requirements? (Select TWO.)

A. Replace the EC2 instance with a larger compute optimized instance.
B. Configure Amazon EC2 Auto Scaling with multiple Availability Zones in private subnets.
C. Configure a NAT gateway in a public subnet to handle web requests.
D. Replace the EC2 instance with a larger memory optimized instance.
E. Configure an Application Load Balancer in a public subnet to distribute web traffic


Question # 48

A company is designing a web application on AWS. The application will use a VPN connection between the company's existing data centers and the company's VPCs. The company uses Amazon Route 53 as its DNS service. The application must use private DNS records to communicate with the on-premises services from a VPC. Which solution will meet these requirements in the MOST secure manner?

A. Create a Route 53 Resolver outbound endpoint. Create a resolver rule. Associate the resolver rule with the VPC.
B. Create a Route 53 Resolver inbound endpoint. Create a resolver rule. Associate the resolver rule with the VPC.
C. Create a Route 53 private hosted zone. Associate the private hosted zone with the VPC.
D. Create a Route 53 public hosted zone. Create a record for each service to allow service communication.


Question # 49

A media company stores movies in Amazon S3. Each movie is stored in a single video file that ranges from 1 GB to 10 GB in size. The company must be able to provide the streaming content of a movie within 5 minutes of a user purchase. There is higher demand for movies that are less than 20 years old than for movies that are more than 20 years old. The company wants to minimize hosting service costs based on demand. Which solution will meet these requirements?

A. Store all media content in Amazon S3. Use S3 Lifecycle policies to move media data into the Infrequent Access tier when the demand for a movie decreases.
B. Store newer movie video files in S3 Standard. Store older movie video files in S3 Standard-Infrequent Access (S3 Standard-IA). When a user orders an older movie, retrieve the video file by using standard retrieval.
C. Store newer movie video files in S3 Intelligent-Tiering. Store older movie video files in S3 Glacier Flexible Retrieval. When a user orders an older movie, retrieve the video file by using expedited retrieval.
D. Store newer movie video files in S3 Standard. Store older movie video files in S3 Glacier Flexible Retrieval. When a user orders an older movie, retrieve the video file by using bulk retrieval.


Question # 50

A business application is hosted on Amazon EC2 and uses Amazon S3 for encrypted object storage. The chief information security officer has directed that no application traffic between the two services should traverse the public internet. Which capability should the solutions architect use to meet the compliance requirements?

A. AWS Key Management Service (AWS KMS)
B. VPC endpoint
C. Private subnet
D. Virtual private gateway
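The mechanism behind option B, a gateway VPC endpoint for S3, keeps EC2-to-S3 traffic on the AWS network. A minimal sketch, with all IDs as placeholders:

```python
# Illustrative only: a gateway VPC endpoint routes S3 traffic privately,
# so it never traverses the public internet. IDs are placeholders.
endpoint_params = {
    "VpcId": "vpc-0abc1234567890def",
    "ServiceName": "com.amazonaws.us-east-1.s3",  # Region-specific S3 service
    "VpcEndpointType": "Gateway",                 # gateway type suits S3
    "RouteTableIds": ["rtb-0abc1234567890def"],   # tables that get the S3 route
}

# With boto3: boto3.client("ec2").create_vpc_endpoint(**endpoint_params)
```

Once the endpoint is associated with the route tables, S3 requests from instances in those subnets resolve to the endpoint rather than an internet gateway.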


Question # 51

To meet security requirements, a company needs to encrypt all of its application data in transit while communicating with an Amazon RDS MySQL DB instance. A recent security audit revealed that encryption at rest is enabled using AWS Key Management Service (AWS KMS), but encryption of data in transit is not enabled. What should a solutions architect do to satisfy the security requirements?

A. Enable IAM database authentication on the database.
B. Provide self-signed certificates. Use the certificates in all connections to the RDS instance.
C. Take a snapshot of the RDS instance. Restore the snapshot to a new instance with encryption enabled.
D. Download AWS-provided root certificates. Provide the certificates in all connections to the RDS instance.


Question # 52

A company stores text files in Amazon S3. The text files include customer chat messages, date and time information, and customer personally identifiable information (PII). The company needs a solution to provide samples of the conversations to an external service provider for quality control. The external service provider needs to randomly pick sample conversations up to the most recent conversation. The company must not share the customer PII with the external service provider. The solution must scale when the number of customer conversations increases. Which solution will meet these requirements with the LEAST operational overhead?

A. Create an Object Lambda Access Point. Create an AWS Lambda function that redacts the PII when the function reads the file. Instruct the external service provider to access the Object Lambda Access Point.
B. Create a batch process on an Amazon EC2 instance that regularly reads all new files, redacts the PII from the files, and writes the redacted files to a different S3 bucket. Instruct the external service provider to access the bucket that does not contain the PII.
C. Create a web application on an Amazon EC2 instance that presents a list of the files, redacts the PII from the files, and allows the external service provider to download new versions of the files that have the PII redacted.
D. Create an Amazon DynamoDB table. Create an AWS Lambda function that reads only the data in the files that does not contain PII. Configure the Lambda function to store the non-PII data in the DynamoDB table when a new file is written to Amazon S3. Grant the external service provider access to the DynamoDB table.


Question # 53

A company wants to deploy its containerized application workloads to a VPC across three Availability Zones. The company needs a solution that is highly available across Availability Zones. The solution must require minimal changes to the application. Which solution will meet these requirements with the LEAST operational overhead?

A. Use Amazon Elastic Container Service (Amazon ECS). Configure Amazon ECS Service Auto Scaling to use target tracking scaling. Set the minimum capacity to 3. Set the task placement strategy type to spread with an Availability Zone attribute.
B. Use Amazon Elastic Kubernetes Service (Amazon EKS) self-managed nodes. Configure Application Auto Scaling to use target tracking scaling. Set the minimum capacity to 3.
C. Use Amazon EC2 Reserved Instances. Launch three EC2 instances in a spread placement group. Configure an Auto Scaling group to use target tracking scaling. Set the minimum capacity to 3.
D. Use an AWS Lambda function. Configure the Lambda function to connect to a VPC. Configure Application Auto Scaling to use Lambda as a scalable target. Set the minimum capacity to 3.


Question # 54

A company needs to use its on-premises LDAP directory service to authenticate its users to the AWS Management Console. The directory service is not compatible with Security Assertion Markup Language (SAML). Which solution meets these requirements?

A. Enable AWS IAM Identity Center (AWS Single Sign-On) between AWS and the on-premises LDAP.
B. Create an IAM policy that uses AWS credentials, and integrate the policy into LDAP.
C. Set up a process that rotates the IAM credentials whenever LDAP credentials are updated.
D. Develop an on-premises custom identity broker application or process that uses AWS Security Token Service (AWS STS) to get short-lived credentials.


Question # 55

A company wants to migrate its on-premises Microsoft SQL Server Enterprise edition database to AWS. The company's online application uses the database to process transactions. The data analysis team uses the same production database to run reports for analytical processing. The company wants to reduce operational overhead by moving to managed services wherever possible. Which solution will meet these requirements with the LEAST operational overhead?

A. Migrate to Amazon RDS for Microsoft SQL Server. Use read replicas for reporting purposes.
B. Migrate to Microsoft SQL Server on Amazon EC2. Use Always On read replicas for reporting purposes.
C. Migrate to Amazon DynamoDB. Use DynamoDB on-demand replicas for reporting purposes.
D. Migrate to Amazon Aurora MySQL. Use Aurora read replicas for reporting purposes.


Question # 56

A company's website is used to sell products to the public. The site runs on Amazon EC2 instances in an Auto Scaling group behind an Application Load Balancer (ALB). There is also an Amazon CloudFront distribution, and AWS WAF is being used to protect against SQL injection attacks. The ALB is the origin for the CloudFront distribution. A recent review of security logs revealed an external malicious IP that needs to be blocked from accessing the website. What should a solutions architect do to protect the application?

A. Modify the network ACL on the CloudFront distribution to add a deny rule for the malicious IP address.
B. Modify the configuration of AWS WAF to add an IP match condition to block the malicious IP address.
C. Modify the network ACL for the EC2 instances in the target groups behind the ALB to deny the malicious IP address.
D. Modify the security groups for the EC2 instances in the target groups behind the ALB to deny the malicious IP address.
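In current AWS WAF (WAFv2), the IP match condition in option B is implemented as an IP set that a blocking rule references. A hedged sketch, with the address being the standard documentation placeholder rather than a real malicious IP:

```python
# Illustrative only: an IP set that a WAF block rule can reference.
# The address below is a documentation placeholder (203.0.113.0/24 range).
ip_set_params = {
    "Name": "blocked-ips",
    "Scope": "CLOUDFRONT",            # WAF sits in front of the CloudFront distribution
    "IPAddressVersion": "IPV4",
    "Addresses": ["203.0.113.7/32"],  # hypothetical malicious IP in CIDR form
}

# With boto3: boto3.client("wafv2").create_ip_set(**ip_set_params)
```

A rule in the web ACL would then match this IP set and apply a Block action, stopping the traffic at the edge before it reaches the ALB.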


Question # 57

A company has a web application for travel ticketing. The application is based on a database that runs in a single data center in North America. The company wants to expand the application to serve a global user base. The company needs to deploy the application to multiple AWS Regions. Average latency must be less than 1 second on updates to the reservation database. The company wants to have separate deployments of its web platform across multiple Regions. However, the company must maintain a single primary reservation database that is globally consistent. Which solution should a solutions architect recommend to meet these requirements?

A. Convert the application to use Amazon DynamoDB. Use a global table for the center reservation table. Use the correct Regional endpoint in each Regional deployment.
B. Migrate the database to an Amazon Aurora MySQL database. Deploy Aurora Read Replicas in each Region. Use the correct Regional endpoint in each Regional deployment for access to the database.
C. Migrate the database to an Amazon RDS for MySQL database. Deploy MySQL read replicas in each Region. Use the correct Regional endpoint in each Regional deployment for access to the database.
D. Migrate the application to an Amazon Aurora Serverless database. Deploy instances of the database to each Region. Use the correct Regional endpoint in each Regional deployment to access the database. Use AWS Lambda functions to process event streams in each Region to synchronize the databases.


Question # 58

A company has an application that uses an Amazon DynamoDB table for storage. A solutions architect discovers that many requests to the table are not returning the latest data. The company's users have not reported any other issues with database performance. Latency is in an acceptable range. Which design change should the solutions architect recommend?

A. Add read replicas to the table.
B. Use a global secondary index (GSI).
C. Request strongly consistent reads for the table.
D. Request eventually consistent reads for the table.
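The fix in option C is a per-request flag. The sketch below shows it with an injectable table object (expected to behave like a boto3 DynamoDB Table resource) so the logic can be exercised without AWS access:

```python
def get_latest_item(table, key):
    """Read an item with a strongly consistent read.

    table is expected to behave like a boto3 DynamoDB Table resource.
    ConsistentRead=True returns the result of all prior successful writes,
    at roughly double the read-capacity cost of an eventually consistent read.
    """
    response = table.get_item(Key=key, ConsistentRead=True)
    return response.get("Item")
```

DynamoDB reads are eventually consistent by default, which explains the stale results without any other performance symptoms.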


Question # 59

A company has an AWS Direct Connect connection from its corporate data center to its VPC in the us-east-1 Region. The company recently acquired a corporation that has several VPCs and a Direct Connect connection between its on-premises data center and the eu-west-2 Region. The CIDR blocks for the VPCs of the company and the corporation do not overlap. The company requires connectivity between two Regions and the data centers. The company needs a solution that is scalable while reducing operational overhead. What should a solutions architect do to meet these requirements?

A. Set up inter-Region VPC peering between the VPC in us-east-1 and the VPCs in eu-west-2.
B. Create private virtual interfaces from the Direct Connect connection in us-east-1 to the VPCs in eu-west-2.
C. Establish VPN appliances in a fully meshed VPN network hosted by Amazon EC2. Use AWS VPN CloudHub to send and receive data between the data centers and each VPC.
D. Connect the existing Direct Connect connection to a Direct Connect gateway. Route traffic from the virtual private gateways of the VPCs in each Region to the Direct Connect gateway.


Question # 60

A company has five organizational units (OUs) as part of its organization in AWS Organizations. Each OU correlates to the five businesses that the company owns. The company's research and development (R&D) business is separating from the company and will need its own organization. A solutions architect creates a separate new management account for this purpose. What should the solutions architect do next in the new management account?

A. Have the R&D AWS account be part of both organizations during the transition.
B. Invite the R&D AWS account to be part of the new organization after the R&D AWS account has left the prior organization.
C. Create a new R&D AWS account in the new organization. Migrate resources from the prior R&D AWS account to the new R&D AWS account.
D. Have the R&D AWS account join the new organization. Make the new management account a member of the prior organization.


Question # 61

A company runs a real-time data ingestion solution on AWS. The solution consists of the most recent version of Amazon Managed Streaming for Apache Kafka (Amazon MSK). The solution is deployed in a VPC in private subnets across three Availability Zones. A solutions architect needs to redesign the data ingestion solution to be publicly available over the internet. The data in transit must also be encrypted. Which solution will meet these requirements with the MOST operational efficiency?

A. Configure public subnets in the existing VPC. Deploy an MSK cluster in the public subnets. Update the MSK cluster security settings to enable mutual TLS authentication.
B. Create a new VPC that has public subnets. Deploy an MSK cluster in the public subnets. Update the MSK cluster security settings to enable mutual TLS authentication.
C. Deploy an Application Load Balancer (ALB) that uses private subnets. Configure an ALB security group inbound rule to allow inbound traffic from the VPC CIDR block for HTTPS protocol.
D. Deploy a Network Load Balancer (NLB) that uses private subnets. Configure an NLB listener for HTTPS communication over the internet.


Question # 62

A company is building a shopping application on AWS. The application offers a catalog that changes once each month and needs to scale with traffic volume. The company wants the lowest possible latency from the application. Data from each user's shopping cart needs to be highly available. User session data must be available even if the user is disconnected and reconnects. What should a solutions architect do to ensure that the shopping cart data is preserved at all times?

A. Configure an Application Load Balancer to enable the sticky sessions feature (session affinity) for access to the catalog in Amazon Aurora.
B. Configure Amazon ElastiCache for Redis to cache catalog data from Amazon DynamoDB and shopping cart data from the user's session.
C. Configure Amazon OpenSearch Service to cache catalog data from Amazon DynamoDB and shopping cart data from the user's session.
D. Configure an Amazon EC2 instance with Amazon Elastic Block Store (Amazon EBS) storage for the catalog and shopping cart. Configure automated snapshots.


Question # 63

A company has deployed a multiplayer game for mobile devices. The game requires live location tracking of players based on latitude and longitude. The data store for the game must support rapid updates and retrieval of locations. The game uses an Amazon RDS for PostgreSQL DB instance with read replicas to store the location data. During peak usage periods, the database is unable to maintain the performance that is needed for reading and writing updates. The game's user base is increasing rapidly. What should a solutions architect do to improve the performance of the data tier?

A. Take a snapshot of the existing DB instance. Restore the snapshot with Multi-AZ enabled.
B. Migrate from Amazon RDS to Amazon OpenSearch Service with OpenSearch Dashboards.
C. Deploy Amazon DynamoDB Accelerator (DAX) in front of the existing DB instance. Modify the game to use DAX.
D. Deploy an Amazon ElastiCache for Redis cluster in front of the existing DB instance. Modify the game to use Redis.


Question # 64

A company wants to run its experimental workloads in the AWS Cloud. The company has a budget for cloud spending. The company's CFO is concerned about cloud spending accountability for each department. The CFO wants to receive notification when the spending threshold reaches 60% of the budget. Which solution will meet these requirements?

A. Use cost allocation tags on AWS resources to label owners. Create usage budgets in AWS Budgets. Add an alert threshold to receive notification when spending exceeds 60% of the budget.
B. Use AWS Cost Explorer forecasts to determine resource owners. Use AWS Cost Anomaly Detection to create alert threshold notifications when spending exceeds 60% of the budget.
C. Use cost allocation tags on AWS resources to label owners. Use AWS Support API on AWS Trusted Advisor to create alert threshold notifications when spending exceeds 60% of the budget.
D. Use AWS Cost Explorer forecasts to determine resource owners. Create usage budgets in AWS Budgets. Add an alert threshold to receive notification when spending exceeds 60% of the budget.
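The 60% alert in options A and D corresponds to a percentage-based notification attached to an AWS Budgets budget. A minimal sketch, assuming a hypothetical budget name and limit; the dicts mirror the shapes that boto3's `budgets.create_budget` and `create_notification` accept.

```python
# Sketch of option A: a monthly cost budget with a notification that fires
# when actual spend passes 60% of the limit. Name and amount are hypothetical.
budget = {
    "BudgetName": "experimental-workloads",            # hypothetical name
    "BudgetLimit": {"Amount": "1000", "Unit": "USD"},  # hypothetical limit
    "TimeUnit": "MONTHLY",
    "BudgetType": "COST",
}
notification = {
    "NotificationType": "ACTUAL",          # alert on actual, not forecasted, spend
    "ComparisonOperator": "GREATER_THAN",
    "Threshold": 60.0,                     # percent of BudgetLimit
    "ThresholdType": "PERCENTAGE",
}
# With boto3: boto3.client("budgets").create_budget(AccountId=..., Budget=budget, ...)
print(notification["Threshold"])
```

Cost allocation tags then let the CFO break the same spend down by department without extra tooling.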


Question # 65

A city has deployed a web application running on Amazon EC2 instances behind an Application Load Balancer (ALB). The application's users have reported sporadic performance, which appears to be related to DDoS attacks originating from random IP addresses. The city needs a solution that requires minimal configuration changes and provides an audit trail for the DDoS sources. Which solution meets these requirements?

A. Enable an AWS WAF web ACL on the ALB, and configure rules to block traffic from unknown sources.
B. Subscribe to Amazon Inspector. Engage the AWS DDoS Response Team (DRT) to integrate mitigating controls into the service.
C. Subscribe to AWS Shield Advanced. Engage the AWS DDoS Response Team (DRT) to integrate mitigating controls into the service.
D. Create an Amazon CloudFront distribution for the application, and set the ALB as the origin. Enable an AWS WAF web ACL on the distribution, and configure rules to block traffic from unknown sources.


Question # 66

A company runs a web application on Amazon EC2 instances in an Auto Scaling group that has a target group. The company designed the application to work with session affinity (sticky sessions) for a better user experience. The application must be available publicly over the internet as an endpoint. A WAF must be applied to the endpoint for additional security. Session affinity (sticky sessions) must be configured on the endpoint. Which combination of steps will meet these requirements? (Select TWO.)

A. Create a public Network Load Balancer. Specify the application target group.
B. Create a Gateway Load Balancer. Specify the application target group.
C. Create a public Application Load Balancer. Specify the application target group.
D. Create a second target group. Add Elastic IP addresses to the EC2 instances.
E. Create a web ACL in AWS WAF. Associate the web ACL with the endpoint.


Question # 67

A security audit reveals that Amazon EC2 instances are not being patched regularly. A solutions architect needs to provide a solution that will run regular security scans across a large fleet of EC2 instances. The solution should also patch the EC2 instances on a regular schedule and provide a report of each instance's patch status. Which solution will meet these requirements?

A. Set up Amazon Macie to scan the EC2 instances for software vulnerabilities. Set up a cron job on each EC2 instance to patch the instance on a regular schedule.
B. Turn on Amazon GuardDuty in the account. Configure GuardDuty to scan the EC2 instances for software vulnerabilities. Set up AWS Systems Manager Session Manager to patch the EC2 instances on a regular schedule.
C. Set up Amazon Detective to scan the EC2 instances for software vulnerabilities. Set up an Amazon EventBridge scheduled rule to patch the EC2 instances on a regular schedule.
D. Turn on Amazon Inspector in the account. Configure Amazon Inspector to scan the EC2 instances for software vulnerabilities. Set up AWS Systems Manager Patch Manager to patch the EC2 instances on a regular schedule.


Question # 68

A manufacturing company runs its report generation application on AWS. The application generates each report in about 20 minutes. The application is built as a monolith that runs on a single Amazon EC2 instance. The application requires frequent updates to its tightly coupled modules. The application becomes complex to maintain as the company adds new features. Each time the company patches a software module, the application experiences downtime. Report generation must restart from the beginning after any interruptions. The company wants to redesign the application so that the application can be flexible, scalable, and gradually improved. The company wants to minimize application downtime. Which solution will meet these requirements?

A. Run the application on AWS Lambda as a single function with maximum provisioned concurrency.
B. Run the application on Amazon EC2 Spot Instances as microservices with a Spot Fleet default allocation strategy.
C. Run the application on Amazon Elastic Container Service (Amazon ECS) as microservices with service auto scaling.
D. Run the application on AWS Elastic Beanstalk as a single application environment with an all-at-once deployment strategy.


Question # 69

A company uses Amazon EC2 instances and Amazon Elastic Block Store (Amazon EBS) volumes to run an application. The company creates one snapshot of each EBS volume every day to meet compliance requirements. The company wants to implement an architecture that prevents the accidental deletion of EBS volume snapshots. The solution must not change the administrative rights of the storage administrator user. Which solution will meet these requirements with the LEAST administrative effort?

A. Create an IAM role that has permission to delete snapshots. Attach the role to a new EC2 instance. Use the AWS CLI from the new EC2 instance to delete snapshots.
B. Create an IAM policy that denies snapshot deletion. Attach the policy to the storage administrator user.
C. Add tags to the snapshots. Create retention rules in Recycle Bin for EBS snapshots that have the tags.
D. Lock the EBS snapshots to prevent deletion.
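Option C relies on a Recycle Bin retention rule, which keeps deleted snapshots recoverable without changing anyone's permissions. A hedged sketch of such a rule follows; the tag key/value and the 7-day period are hypothetical, and the dict mirrors the parameters boto3's `rbin` client `create_rule` call takes.

```python
# Sketch of option C: a Recycle Bin rule that retains deleted EBS snapshots
# carrying a given tag. Tag and retention period below are hypothetical.
rule = {
    "ResourceType": "EBS_SNAPSHOT",
    "RetentionPeriod": {
        "RetentionPeriodValue": 7,          # hypothetical retention window
        "RetentionPeriodUnit": "DAYS",
    },
    "ResourceTags": [
        {"ResourceTagKey": "backup",        # hypothetical tag on the snapshots
         "ResourceTagValue": "daily"},
    ],
}
# With boto3: boto3.client("rbin").create_rule(**rule)
print(rule["ResourceType"])
```

Deleted snapshots matching the tag move to Recycle Bin instead of being destroyed, so an accidental deletion is reversible while the administrator's rights stay untouched.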


Question # 70

A company is deploying a new application to Amazon Elastic Kubernetes Service (Amazon EKS) with an AWS Fargate cluster. The application needs a storage solution for data persistence. The solution must be highly available and fault tolerant. The solution also must be shared between multiple application containers. Which solution will meet these requirements with the LEAST operational overhead?

A. Create Amazon Elastic Block Store (Amazon EBS) volumes in the same Availability Zones where EKS worker nodes are placed. Register the volumes in a StorageClass object on an EKS cluster. Use EBS Multi-Attach to share the data between containers.
B. Create an Amazon Elastic File System (Amazon EFS) file system. Register the file system in a StorageClass object on an EKS cluster. Use the same file system for all containers.
C. Create an Amazon Elastic Block Store (Amazon EBS) volume. Register the volume in a StorageClass object on an EKS cluster. Use the same volume for all containers.
D. Create Amazon Elastic File System (Amazon EFS) file systems in the same Availability Zones where EKS worker nodes are placed. Register the file systems in a StorageClass object on an EKS cluster. Create an AWS Lambda function to synchronize the data between file systems.


Question # 71

A company has NFS servers in an on-premises data center that need to periodically back up small amounts of data to Amazon S3. Which solution meets these requirements and is MOST cost-effective?

A. Set up AWS Glue to copy the data from the on-premises servers to Amazon S3.
B. Set up an AWS DataSync agent on the on-premises servers, and sync the data to Amazon S3.
C. Set up an SFTP sync using AWS Transfer for SFTP to sync data from on premises to Amazon S3.
D. Set up an AWS Direct Connect connection between the on-premises data center and a VPC, and copy the data to Amazon S3.


Question # 72

A company has an application that uses Docker containers in its local data center. The application runs on a container host that stores persistent data in a volume on the host. The container instances use the stored persistent data. The company wants to move the application to a fully managed service because the company does not want to manage any servers or storage infrastructure. Which solution will meet these requirements?

A. Use Amazon Elastic Kubernetes Service (Amazon EKS) with self-managed nodes. Create an Amazon Elastic Block Store (Amazon EBS) volume attached to an Amazon EC2 instance. Use the EBS volume as a persistent volume mounted in the containers.
B. Use Amazon Elastic Container Service (Amazon ECS) with an AWS Fargate launch type. Create an Amazon Elastic File System (Amazon EFS) volume. Add the EFS volume as a persistent storage volume mounted in the containers.
C. Use Amazon Elastic Container Service (Amazon ECS) with an AWS Fargate launch type. Create an Amazon S3 bucket. Map the S3 bucket as a persistent storage volume mounted in the containers.
D. Use Amazon Elastic Container Service (Amazon ECS) with an Amazon EC2 launch type. Create an Amazon Elastic File System (Amazon EFS) volume. Add the EFS volume as a persistent storage volume mounted in the containers.


Question # 73

A company stores critical data in Amazon DynamoDB tables in the company's AWS account. An IT administrator accidentally deleted a DynamoDB table. The deletion caused a significant loss of data and disrupted the company's operations. The company wants to prevent this type of disruption in the future. Which solution will meet this requirement with the LEAST operational overhead?

A. Configure a trail in AWS CloudTrail. Create an Amazon EventBridge rule for delete actions. Create an AWS Lambda function to automatically restore deleted DynamoDB tables.
B. Create a backup and restore plan for the DynamoDB tables. Recover the DynamoDB tables manually.
C. Configure deletion protection on the DynamoDB tables.
D. Enable point-in-time recovery on the DynamoDB tables.


Question # 74

A company hosts multiple applications on AWS for different product lines. The applications use different compute resources, including Amazon EC2 instances and Application Load Balancers. The applications run in different AWS accounts under the same organization in AWS Organizations across multiple AWS Regions. Teams for each product line have tagged each compute resource in the individual accounts. The company wants more details about the cost for each product line from the consolidated billing feature in Organizations. Which combination of steps will meet these requirements? (Select TWO.)

A. Select a specific AWS generated tag in the AWS Billing console.
B. Select a specific user-defined tag in the AWS Billing console.
C. Select a specific user-defined tag in the AWS Resource Groups console.
D. Activate the selected tag from each AWS account.
E. Activate the selected tag from the Organizations management account.


Question # 75

A solutions architect needs to ensure that API calls to Amazon DynamoDB from Amazon EC2 instances in a VPC do not travel across the internet. Which combination of steps should the solutions architect take to meet this requirement? (Choose two.)

A. Create a route table entry for the endpoint.
B. Create a gateway endpoint for DynamoDB.
C. Create an interface endpoint for Amazon EC2.
D. Create an elastic network interface for the endpoint in each of the subnets of the VPC.
E. Create a security group entry in the endpoint's security group to provide access.
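Options A and B describe a gateway VPC endpoint for DynamoDB plus the route-table association that keeps traffic on the AWS network. A minimal sketch, with a hypothetical VPC ID, route table ID, and Region; the dict mirrors the parameters boto3's `ec2.create_vpc_endpoint` accepts, and passing `RouteTableIds` is what creates the route entry.

```python
# Sketch of options A + B: a gateway endpoint for DynamoDB associated with a
# route table. IDs and Region below are hypothetical placeholders.
params = {
    "VpcEndpointType": "Gateway",                      # gateway, not interface
    "VpcId": "vpc-0123456789abcdef0",                  # hypothetical VPC
    "ServiceName": "com.amazonaws.us-east-1.dynamodb", # hypothetical Region
    "RouteTableIds": ["rtb-0123456789abcdef0"],        # creates the route entry
}
# With boto3: boto3.client("ec2").create_vpc_endpoint(**params)
print(params["VpcEndpointType"])
```

DynamoDB (like S3) supports only gateway endpoints, which is why an interface endpoint or an elastic network interface is not part of the answer.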


Question # 76

A company hosts a data lake on Amazon S3. The data lake ingests data in Apache Parquet format from various data sources. The company uses multiple transformation steps to prepare the ingested data. The steps include filtering of anomalies, normalizing of data to standard date and time values, and generation of aggregates for analyses. The company must store the transformed data in S3 buckets that data analysts access. The company needs a prebuilt solution for data transformation that does not require code. The solution must provide data lineage and data profiling. The company needs to share the data transformation steps with employees throughout the company. Which solution will meet these requirements?

A. Configure an AWS Glue Studio visual canvas to transform the data. Share the transformation steps with employees by using AWS Glue jobs.
B. Configure Amazon EMR Serverless to transform the data. Share the transformation steps with employees by using EMR Serverless jobs.
C. Configure AWS Glue DataBrew to transform the data. Share the transformation steps with employees by using DataBrew recipes.
D. Create Amazon Athena tables for the data. Write Athena SQL queries to transform the data. Share the Athena SQL queries with employees.


Question # 77

A company is using an Application Load Balancer (ALB) to present its application to the internet. The company finds abnormal traffic access patterns across the application. A solutions architect needs to improve visibility into the infrastructure to help the company understand these abnormalities better. What is the MOST operationally efficient solution that meets these requirements?

A. Create a table in Amazon Athena for AWS CloudTrail logs. Create a query for the relevant information.
B. Enable ALB access logging to Amazon S3. Create a table in Amazon Athena, and query the logs.
C. Enable ALB access logging to Amazon S3. Open each file in a text editor, and search each line for the relevant information.
D. Use Amazon EMR on a dedicated Amazon EC2 instance to directly query the ALB to acquire traffic access log information.


Question # 78

A company copies 200 TB of data from a recent ocean survey onto AWS Snowball Edge Storage Optimized devices. The company has a high performance computing (HPC) cluster that is hosted on AWS to look for oil and gas deposits. A solutions architect must provide the cluster with consistent sub-millisecond latency and high-throughput access to the data on the Snowball Edge Storage Optimized devices. The company is sending the devices back to AWS. Which solution will meet these requirements?

A. Create an Amazon S3 bucket. Import the data into the S3 bucket. Configure an AWS Storage Gateway file gateway to use the S3 bucket. Access the file gateway from the HPC cluster instances.
B. Create an Amazon S3 bucket. Import the data into the S3 bucket. Configure an Amazon FSx for Lustre file system, and integrate it with the S3 bucket. Access the FSx for Lustre file system from the HPC cluster instances.
C. Create an Amazon S3 bucket and an Amazon Elastic File System (Amazon EFS) file system. Import the data into the S3 bucket. Copy the data from the S3 bucket to the EFS file system. Access the EFS file system from the HPC cluster instances.
D. Create an Amazon FSx for Lustre file system. Import the data directly into the FSx for Lustre file system. Access the FSx for Lustre file system from the HPC cluster instances.


Question # 79

A gaming company wants to launch a new internet-facing application in multiple AWS Regions. The application will use the TCP and UDP protocols for communication. The company needs to provide high availability and minimum latency for global users. Which combination of actions should a solutions architect take to meet these requirements? (Select TWO.)

A. Create internal Network Load Balancers in front of the application in each Region.
B. Create external Application Load Balancers in front of the application in each Region.
C. Create an AWS Global Accelerator accelerator to route traffic to the load balancers in each Region.
D. Configure Amazon Route 53 to use a geolocation routing policy to distribute the traffic.
E. Configure Amazon CloudFront to handle the traffic and route requests to the application in each Region.


Question # 80

A company's application runs on Amazon EC2 instances that are in multiple Availability Zones. The application needs to ingest real-time data from third-party applications. The company needs a data ingestion solution that places the ingested raw data in an Amazon S3 bucket. Which solution will meet these requirements?

A. Create Amazon Kinesis data streams for data ingestion. Create Amazon Kinesis Data Firehose delivery streams to consume the Kinesis data streams. Specify the S3 bucket as the destination of the delivery streams.
B. Create database migration tasks in AWS Database Migration Service (AWS DMS). Specify replication instances of the EC2 instances as the source endpoints. Specify the S3 bucket as the target endpoint. Set the migration type to migrate existing data and replicate ongoing changes.
C. Create and configure AWS DataSync agents on the EC2 instances. Configure DataSync tasks to transfer data from the EC2 instances to the S3 bucket.
D. Create an AWS Direct Connect connection to the application for data ingestion. Create Amazon Kinesis Data Firehose delivery streams to consume direct PUT operations from the application. Specify the S3 bucket as the destination of the delivery streams.


Question # 81

A financial services company wants to shut down two data centers and migrate more than 100 TB of data to AWS. The data has an intricate directory structure with millions of small files stored in deep hierarchies of subfolders. Most of the data is unstructured, and the company's file storage consists of SMB-based storage types from multiple vendors. The company does not want to change its applications to access the data after migration. What should a solutions architect do to meet these requirements with the LEAST operational overhead?

A. Use AWS Direct Connect to migrate the data to Amazon S3.
B. Use AWS DataSync to migrate the data to Amazon FSx for Lustre.
C. Use AWS DataSync to migrate the data to Amazon FSx for Windows File Server.
D. Use AWS Direct Connect to migrate the on-premises file storage data to an AWS Storage Gateway volume gateway.


Question # 82

An IoT company is releasing a mattress that has sensors to collect data about a user's sleep. The sensors will send data to an Amazon S3 bucket. The sensors collect approximately 2 MB of data every night for each mattress. The company must process and summarize the data for each mattress. The results need to be available as soon as possible. Data processing will require 1 GB of memory and will finish within 30 seconds. Which solution will meet these requirements MOST cost-effectively?

A. Use AWS Glue with a Scala job.
B. Use Amazon EMR with an Apache Spark script.
C. Use AWS Lambda with a Python script.
D. Use AWS Glue with a PySpark job.
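Rough cost arithmetic makes the Lambda choice (option C) concrete. The rate below is an assumption based on Lambda's published per-GB-second price; check current pricing before relying on it.

```python
# Back-of-the-envelope Lambda cost for one processing run (option C):
# 1 GB of memory held for 30 seconds, priced per GB-second.
PRICE_PER_GB_SECOND = 0.0000166667   # assumed Lambda rate in USD; verify current pricing

gb_seconds_per_run = 1 * 30          # 1 GB x 30 s
cost_per_run = gb_seconds_per_run * PRICE_PER_GB_SECOND
print(round(cost_per_run, 6))        # roughly $0.0005 per mattress per night
```

A 2 MB payload finishing in 30 seconds sits comfortably inside Lambda's limits, whereas Glue and EMR carry per-job or per-cluster minimums that dwarf this fraction of a cent.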


Question # 83

A company plans to migrate to AWS and use Amazon EC2 On-Demand Instances for its application. During the migration testing phase, a technical team observes that the application takes a long time to launch and load memory to become fully productive. Which solution will reduce the launch time of the application during the next testing phase?

A. Launch two or more EC2 On-Demand Instances. Turn on auto scaling features and make the EC2 On-Demand Instances available during the next testing phase.
B. Launch EC2 Spot Instances to support the application and to scale the application so it is available during the next testing phase.
C. Launch the EC2 On-Demand Instances with hibernation turned on. Configure EC2 Auto Scaling warm pools during the next testing phase.
D. Launch EC2 On-Demand Instances with Capacity Reservations. Start additional EC2 instances during the next testing phase.


Question # 84

A company runs an application on AWS. The application receives inconsistent amounts of usage. The application uses AWS Direct Connect to connect to an on-premises MySQL-compatible database. The on-premises database consistently uses a minimum of 2 GiB of memory. The company wants to migrate the on-premises database to a managed AWS service. The company wants to use auto scaling capabilities to manage unexpected workload increases. Which solution will meet these requirements with the LEAST administrative overhead?

A. Provision an Amazon DynamoDB database with default read and write capacity settings.
B. Provision an Amazon Aurora database with a minimum capacity of 1 Aurora capacity unit (ACU).
C. Provision an Amazon Aurora Serverless v2 database with a minimum capacity of 1 Aurora capacity unit (ACU).
D. Provision an Amazon RDS for MySQL database with 2 GiB of memory.


Question # 85

A company uses AWS Organizations. The company wants to operate some of its AWS accounts with different budgets. The company wants to receive alerts and automatically prevent provisioning of additional resources on AWS accounts when the allocated budget threshold is met during a specific period. Which combination of solutions will meet these requirements? (Select THREE.)

A. Use AWS Budgets to create a budget. Set the budget amount under the Cost and Usage Reports section of the required AWS accounts.
B. Use AWS Budgets to create a budget. Set the budget amount under the Billing dashboards of the required AWS accounts.
C. Create an IAM user for AWS Budgets to run budget actions with the required permissions.
D. Create an IAM role for AWS Budgets to run budget actions with the required permissions.
E. Add an alert to notify the company when each account meets its budget threshold. Add a budget action that selects the IAM identity created with the appropriate config rule to prevent provisioning of additional resources.
F. Add an alert to notify the company when each account meets its budget threshold. Add a budget action that selects the IAM identity created with the appropriate service control policy (SCP) to prevent provisioning of additional resources.


Question # 86

A recent analysis of a company's IT expenses highlights the need to reduce backup costs. The company's chief information officer wants to simplify the on-premises backup infrastructure and reduce costs by eliminating the use of physical backup tapes. The company must preserve the existing investment in the on-premises backup applications and workflows. What should a solutions architect recommend?

A. Set up AWS Storage Gateway to connect with the backup applications using the NFS interface.
B. Set up an Amazon EFS file system that connects with the backup applications using the NFS interface.
C. Set up an Amazon EFS file system that connects with the backup applications using the iSCSI interface.
D. Set up AWS Storage Gateway to connect with the backup applications using the iSCSI virtual tape library (VTL) interface.


Question # 87

A company is migrating its multi-tier on-premises application to AWS. The application consists of a single-node MySQL database and a multi-node web tier. The company must minimize changes to the application during the migration. The company wants to improve application resiliency after the migration. Which combination of steps will meet these requirements? (Select TWO.)

A. Migrate the web tier to Amazon EC2 instances in an Auto Scaling group behind an Application Load Balancer.
B. Migrate the database to Amazon EC2 instances in an Auto Scaling group behind a Network Load Balancer.
C. Migrate the database to an Amazon RDS Multi-AZ deployment.
D. Migrate the web tier to an AWS Lambda function.
E. Migrate the database to an Amazon DynamoDB table.


Question # 88

A company runs its applications on Amazon EC2 instances that are backed by Amazon Elastic Block Store (Amazon EBS). The EC2 instances run the most recent Amazon Linux release. The applications are experiencing availability issues when the company's employees store and retrieve files that are 25 GB or larger. The company needs a solution that does not require the company to transfer files between EC2 instances. The files must be available across many EC2 instances and across multiple Availability Zones. Which solution will meet these requirements?

A. Migrate all the files to an Amazon S3 bucket. Instruct the employees to access the files from the S3 bucket.
B. Take a snapshot of the existing EBS volume. Mount the snapshot as an EBS volume across the EC2 instances. Instruct the employees to access the files from the EC2 instances.
C. Mount an Amazon Elastic File System (Amazon EFS) file system across all the EC2 instances. Instruct the employees to access the files from the EC2 instances.
D. Create an Amazon Machine Image (AMI) from the EC2 instances. Configure new EC2 instances from the AMI that use an instance store volume. Instruct the employees to access the files from the EC2 instances.


Question # 89

A company has an application that delivers on-demand training videos to students around the world. The application also allows authorized content developers to upload videos. The data is stored in an Amazon S3 bucket in the us-east-2 Region. The company has created an S3 bucket in the eu-west-2 Region and an S3 bucket in the ap-southeast-1 Region. The company wants to replicate the data to the new S3 buckets. The company needs to minimize latency for developers who upload videos and students who stream videos near eu-west-2 and ap-southeast-1. Which combination of steps will meet these requirements with the FEWEST changes to the application? (Select TWO.)

A. Configure one-way replication from the us-east-2 S3 bucket to the eu-west-2 S3 bucket. Configure one-way replication from the us-east-2 S3 bucket to the ap-southeast-1 S3 bucket.
B. Configure one-way replication from the us-east-2 S3 bucket to the eu-west-2 S3 bucket. Configure one-way replication from the eu-west-2 S3 bucket to the ap-southeast-1 S3 bucket.
C. Configure two-way (bidirectional) replication among the S3 buckets that are in all three Regions.
D. Create an S3 Multi-Region Access Point. Modify the application to use the Amazon Resource Name (ARN) of the Multi-Region Access Point for video streaming. Do not modify the application for video uploads.
E. Create an S3 Multi-Region Access Point. Modify the application to use the Amazon Resource Name (ARN) of the Multi-Region Access Point for video streaming and uploads.


Question # 90

A company has 150 TB of archived image data stored on-premises that needs to be moved to the AWS Cloud within the next month. The company's current network connection allows up to 100 Mbps uploads for this purpose during the night only. What is the MOST cost-effective mechanism to move this data and meet the migration deadline?

A. Use AWS Snowmobile to ship the data to AWS.
B. Order multiple AWS Snowball devices to ship the data to AWS.
C. Enable Amazon S3 Transfer Acceleration and securely upload the data.
D. Create an Amazon S3 VPC endpoint and establish a VPN to upload the data.
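Back-of-the-envelope arithmetic shows why the network options cannot meet the one-month deadline here (assuming decimal units and a fully utilized link):

```python
# Transfer time for 150 TB over a 100 Mbps link, running 24/7.
data_bits = 150e12 * 8          # 150 TB expressed in bits
seconds = data_bits / 100e6     # at 100 Mbps (1e8 bits per second)
days_full_time = seconds / 86400
print(round(days_full_time))    # ~139 days even with round-the-clock uploads
```

Nights-only uploads (say ~8 hours per night, an assumption since the question gives no hours) would stretch this roughly threefold, to over a year, which is why shipping multiple Snowball devices is the practical choice; Snowmobile is sized for exabyte-scale jobs and would be overkill at 150 TB.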


Question # 91

A company has an on-premises MySQL database that handles transactional data. The company is migrating the database to the AWS Cloud. The migrated database must maintain compatibility with the company's applications that use the database. The migrated database also must scale automatically during periods of increased demand. Which migration solution will meet these requirements?

A. Use native MySQL tools to migrate the database to Amazon RDS for MySQL. Configure elastic storage scaling.
B. Migrate the database to Amazon Redshift by using the mysqldump utility. Turn on Auto Scaling for the Amazon Redshift cluster.
C. Use AWS Database Migration Service (AWS DMS) to migrate the database to Amazon Aurora. Turn on Aurora Auto Scaling.
D. Use AWS Database Migration Service (AWS DMS) to migrate the database to Amazon DynamoDB. Configure an Auto Scaling policy.


Question # 92

A solutions architect wants to use the following JSON text as an identity-based policy to grant specific permissions: Which IAM principals can the solutions architect attach this policy to? (Select TWO.)

A. Role
B. Group
C. Organization
D. Amazon Elastic Container Service (Amazon ECS) resource
E. Amazon EC2 resource
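The JSON policy text referenced in this question is not reproduced in the dump. For orientation only, here is a minimal sketch (Python, standard library; the bucket ARN and actions are hypothetical, not the exam's actual JSON) of the shape of an identity-based policy. Note the absence of a Principal element, which is what distinguishes it from a resource-based policy and is why it can attach to IAM users, groups, and roles rather than to resources such as EC2 or ECS:

```python
import json

# Illustrative identity-based policy. Identity-based policies carry no
# "Principal" element -- the principal is whichever IAM identity (user,
# group, or role) the policy is attached to.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            # Hypothetical bucket, for illustration only.
            "Resource": "arn:aws:s3:::example-bucket/*",
        }
    ],
}

# The absence of "Principal" is the tell that this is identity-based.
assert "Principal" not in policy["Statement"][0]
print(json.dumps(policy, indent=2))
```

A resource-based policy (for example, an S3 bucket policy) would add a Principal element and attach to the resource itself instead.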


Question # 93

A company that uses AWS needs a solution to predict the resources needed for manufacturing processes each month. The solution must use historical values that are currently stored in an Amazon S3 bucket. The company has no machine learning (ML) experience and wants to use a managed service for the training and predictions. Which combination of steps will meet these requirements? (Select TWO.)

A. Deploy an Amazon SageMaker model. Create a SageMaker endpoint for inference.
B. Use Amazon SageMaker to train a model by using the historical data in the S3 bucket.
C. Configure an AWS Lambda function with a function URL that uses Amazon SageMaker endpoints to create predictions based on the inputs.
D. Configure an AWS Lambda function with a function URL that uses an Amazon Forecast predictor to create a prediction based on the inputs.
E. Train an Amazon Forecast predictor by using the historical data in the S3 bucket.


Question # 94

A company is running a legacy system on an Amazon EC2 instance. The application code cannot be modified, and the system cannot run on more than one instance. A solutions architect must design a resilient solution that can improve the recovery time for the system. What should the solutions architect recommend to meet these requirements?

A. Enable termination protection for the EC2 instance.
B. Configure the EC2 instance for Multi-AZ deployment.
C. Create an Amazon CloudWatch alarm to recover the EC2 instance in case of failure.
D. Launch the EC2 instance with two Amazon Elastic Block Store (Amazon EBS) volumes that use RAID configurations for storage redundancy.
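As background for the alarm-based recovery option, the sketch below shows the parameter shape for a CloudWatch alarm whose action automatically recovers an EC2 instance on a system status check failure. The alarm name and instance ID are hypothetical; in practice these parameters would be passed to `cloudwatch:PutMetricAlarm` (for example via boto3's `put_metric_alarm`), but they are shown as plain data here to keep the sketch self-contained:

```python
import json

# Sketch of PutMetricAlarm parameters for EC2 automatic recovery.
# Alarm name and instance ID are hypothetical placeholders.
alarm = {
    "AlarmName": "recover-legacy-instance",
    "Namespace": "AWS/EC2",
    "MetricName": "StatusCheckFailed_System",
    "Dimensions": [{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    "Statistic": "Maximum",
    "Period": 60,
    "EvaluationPeriods": 2,
    "Threshold": 1.0,
    "ComparisonOperator": "GreaterThanOrEqualToThreshold",
    # The recover action migrates the instance to healthy hardware while
    # preserving its instance ID, private IP addresses, and EBS volumes.
    "AlarmActions": ["arn:aws:automate:us-east-1:ec2:recover"],
}
print(json.dumps(alarm, indent=2))
```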


Question # 95

A company has applications that run on Amazon EC2 instances. The EC2 instances connect to Amazon RDS databases by using an IAM role that has associated policies. The company wants to use AWS Systems Manager to patch the EC2 instances without disrupting the running applications. Which solution will meet these requirements?

A. Create a new IAM role. Attach the AmazonSSMManagedInstanceCore policy to the new IAM role. Attach the new IAM role to the EC2 instances and the existing IAM role.
B. Create an IAM user. Attach the AmazonSSMManagedInstanceCore policy to the IAM user. Configure Systems Manager to use the IAM user to manage the EC2 instances.
C. Enable Default Host Configuration Management in Systems Manager to manage the EC2 instances.
D. Remove the existing policies from the existing IAM role. Add the AmazonSSMManagedInstanceCore policy to the existing IAM role.


Question # 96

A company has data collection sensors at different locations. The data collection sensors stream a high volume of data to the company. The company wants to design a platform on AWS to ingest and process high-volume streaming data. The solution must be scalable and support data collection in near real time. The company must store the data in Amazon S3 for future reporting. Which solution will meet these requirements with the LEAST operational overhead?

A. Use Amazon Kinesis Data Firehose to deliver streaming data to Amazon S3.
B. Use AWS Glue to deliver streaming data to Amazon S3.
C. Use AWS Lambda to deliver streaming data and store the data to Amazon S3.
D. Use AWS Database Migration Service (AWS DMS) to deliver streaming data to Amazon S3.


Question # 97

A company has a web application that includes an embedded NoSQL database. The application runs on Amazon EC2 instances behind an Application Load Balancer (ALB). The instances run in an Amazon EC2 Auto Scaling group in a single Availability Zone. A recent increase in traffic requires the application to be highly available and for the database to be eventually consistent. Which solution will meet these requirements with the LEAST operational overhead?

A. Replace the ALB with a Network Load Balancer. Maintain the embedded NoSQL database with its replication service on the EC2 instances.
B. Replace the ALB with a Network Load Balancer. Migrate the embedded NoSQL database to Amazon DynamoDB by using AWS Database Migration Service (AWS DMS).
C. Modify the Auto Scaling group to use EC2 instances across three Availability Zones. Maintain the embedded NoSQL database with its replication service on the EC2 instances.
D. Modify the Auto Scaling group to use EC2 instances across three Availability Zones. Migrate the embedded NoSQL database to Amazon DynamoDB by using AWS Database Migration Service (AWS DMS).


Question # 98

A company's developers want a secure way to gain SSH access on the company's Amazon EC2 instances that run the latest version of Amazon Linux. The developers work remotely and in the corporate office. The company wants to use AWS services as a part of the solution. The EC2 instances are hosted in a VPC private subnet and access the internet through a NAT gateway that is deployed in a public subnet. What should a solutions architect do to meet these requirements MOST cost-effectively?

A. Create a bastion host in the same subnet as the EC2 instances. Grant the ec2:CreateVpnConnection IAM permission to the developers. Install EC2 Instance Connect so that the developers can connect to the EC2 instances.
B. Create an AWS Site-to-Site VPN connection between the corporate network and the VPC. Instruct the developers to use the Site-to-Site VPN connection to access the EC2 instances when the developers are on the corporate network. Instruct the developers to set up another VPN connection for access when they work remotely.
C. Create a bastion host in the public subnet of the VPC. Configure the security groups and SSH keys of the bastion host to only allow connections and SSH authentication from the developers' corporate and remote networks. Instruct the developers to connect through the bastion host by using SSH to reach the EC2 instances.
D. Attach the AmazonSSMManagedInstanceCore IAM policy to an IAM role that is associated with the EC2 instances. Instruct the developers to use AWS Systems Manager Session Manager to access the EC2 instances.


Question # 99

A company uses an organization in AWS Organizations to manage AWS accounts that contain applications. The company sets up a dedicated monitoring member account in the organization. The company wants to query and visualize observability data across the accounts by using Amazon CloudWatch. Which solution will meet these requirements?

A. Enable CloudWatch cross-account observability for the monitoring account. Deploy an AWS CloudFormation template provided by the monitoring account in each AWS account to share the data with the monitoring account.
B. Set up service control policies (SCPs) to provide access to CloudWatch in the monitoring account under the Organizations root organizational unit (OU).
C. Configure a new IAM user in the monitoring account. In each AWS account, configure an IAM policy to have access to query and visualize the CloudWatch data in the account. Attach the new IAM policy to the new IAM user.
D. Create a new IAM user in the monitoring account. Create cross-account IAM policies in each AWS account. Attach the IAM policies to the new IAM user.


Question # 100

A company hosts a database that runs on an Amazon RDS instance that is deployed to multiple Availability Zones. The company periodically runs a script against the database to report new entries that are added to the database. The script that runs against the database negatively affects the performance of a critical application. The company needs to improve application performance with minimal costs. Which solution will meet these requirements with the LEAST operational overhead?

A. Add functionality to the script to identify the instance that has the fewest active connections. Configure the script to read from that instance to report the total new entries.
B. Create a read replica of the database. Configure the script to query only the read replica to report the total new entries.
C. Instruct the development team to manually export the new entries for the day in the database at the end of each day.
D. Use Amazon ElastiCache to cache the common queries that the script runs against the database.


Question # 101

A company wants to use an AWS CloudFormation stack for its application in a test environment. The company stores the CloudFormation template in an Amazon S3 bucket that blocks public access. The company wants to grant CloudFormation access to the template in the S3 bucket based on specific user requests to create the test environment. The solution must follow security best practices. Which solution will meet these requirements?

A. Create a gateway VPC endpoint for Amazon S3. Configure the CloudFormation stack to use the S3 object URL.
B. Create an Amazon API Gateway REST API that has the S3 bucket as the target. Configure the CloudFormation stack to use the API Gateway URL.
C. Create a presigned URL for the template object. Configure the CloudFormation stack to use the presigned URL.
D. Allow public access to the template object in the S3 bucket. Block the public access after the test environment is created.


Question # 102

A solutions architect needs to copy files from an Amazon S3 bucket to an Amazon Elastic File System (Amazon EFS) file system and another S3 bucket. The files must be copied continuously. New files are added to the original S3 bucket consistently. The copied files should be overwritten only if the source file changes. Which solution will meet these requirements with the LEAST operational overhead?

A. Create an AWS DataSync location for both the destination S3 bucket and the EFS file system. Create a task for the destination S3 bucket and the EFS file system. Set the transfer mode to transfer only data that has changed.
B. Create an AWS Lambda function. Mount the file system to the function. Set up an S3 event notification to invoke the function when files are created and changed in Amazon S3. Configure the function to copy files to the file system and the destination S3 bucket.
C. Create an AWS DataSync location for both the destination S3 bucket and the EFS file system. Create a task for the destination S3 bucket and the EFS file system. Set the transfer mode to transfer all data.
D. Launch an Amazon EC2 instance in the same VPC as the file system. Mount the file system. Create a script to routinely synchronize all objects that changed in the origin S3 bucket to the destination S3 bucket and the mounted file system.
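The "transfer only data that has changed" behavior in the DataSync options maps to the `TransferMode` setting in a task's `Options` block. A minimal sketch of that configuration (plain data rather than a live API call, so it runs offline; task and location ARNs are omitted as they would be account-specific):

```python
import json

# Sketch of the Options block for a DataSync CreateTask/UpdateTask call.
# TransferMode "CHANGED" copies only data and metadata that differ between
# source and destination, matching the "overwrite only if the source file
# changes" requirement; "ALL" would transfer everything on every run.
task_options = {
    "TransferMode": "CHANGED",
    "OverwriteMode": "ALWAYS",   # allow changed source files to overwrite
    "VerifyMode": "ONLY_FILES_TRANSFERRED",
}
print(json.dumps(task_options, indent=2))
```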


Question # 103

The DNS provider that hosts a company's domain name records is experiencing outages that cause service disruption for a website running on AWS. The company needs to migrate to a more resilient managed DNS service and wants the service to run on AWS. What should a solutions architect do to rapidly migrate the DNS hosting service?

A. Create an Amazon Route 53 public hosted zone for the domain name. Import the zone file containing the domain records hosted by the previous provider.
B. Create an Amazon Route 53 private hosted zone for the domain name. Import the zone file containing the domain records hosted by the previous provider.
C. Create a Simple AD directory in AWS. Enable zone transfer between the DNS provider and AWS Directory Service for Microsoft Active Directory for the domain records.
D. Create an Amazon Route 53 Resolver inbound endpoint in the VPC. Specify the IP addresses that the provider's DNS will forward DNS queries to. Configure the provider's DNS to forward DNS queries for the domain to the IP addresses that are specified in the inbound endpoint.


Question # 104

A company has an online gaming application that has TCP and UDP multiplayer gaming capabilities. The company uses Amazon Route 53 to point the application traffic to multiple Network Load Balancers (NLBs) in different AWS Regions. The company needs to improve application performance and decrease latency for the online game in preparation for user growth. Which solution will meet these requirements?

A. Add an Amazon CloudFront distribution in front of the NLBs. Increase the Cache-Control: max-age parameter.
B. Replace the NLBs with Application Load Balancers (ALBs). Configure Route 53 to use latency-based routing.
C. Add AWS Global Accelerator in front of the NLBs. Configure a Global Accelerator endpoint to use the correct listener ports.
D. Add an Amazon API Gateway endpoint behind the NLBs. Enable API caching. Override method caching for the different stages.


Question # 105

A research company runs experiments that are powered by a simulation application and a visualization application. The simulation application runs on Linux and outputs intermediate data to an NFS share every 5 minutes. The visualization application is a Windows desktop application that displays the simulation output and requires an SMB file system. The company maintains two synchronized file systems. This strategy is causing data duplication and inefficient resource usage. The company needs to migrate the applications to AWS without making code changes to either application. Which solution will meet these requirements?

A. Migrate both applications to AWS Lambda. Create an Amazon S3 bucket to exchange data between the applications.
B. Migrate both applications to Amazon Elastic Container Service (Amazon ECS). Configure Amazon FSx File Gateway for storage.
C. Migrate the simulation application to Linux Amazon EC2 instances. Migrate the visualization application to Windows EC2 instances. Configure Amazon Simple Queue Service (Amazon SQS) to exchange data between the applications.
D. Migrate the simulation application to Linux Amazon EC2 instances. Migrate the visualization application to Windows EC2 instances. Configure Amazon FSx for NetApp ONTAP for storage.


Question # 106

A solutions architect is designing an AWS Identity and Access Management (IAM) authorization model for a company's AWS account. The company has designated five specific employees to have full access to AWS services and resources in the AWS account. The solutions architect has created an IAM user for each of the five designated employees and has created an IAM user group. Which solution will meet these requirements?

A. Attach the AdministratorAccess resource-based policy to the IAM user group. Place each of the five designated employee IAM users in the IAM user group.
B. Attach the SystemAdministrator identity-based policy to the IAM user group. Place each of the five designated employee IAM users in the IAM user group.
C. Attach the AdministratorAccess identity-based policy to the IAM user group. Place each of the five designated employee IAM users in the IAM user group.
D. Attach the SystemAdministrator resource-based policy to the IAM user group. Place each of the five designated employee IAM users in the IAM user group.


Question # 107

A company's ecommerce website has unpredictable traffic and uses AWS Lambda functions to directly access a private Amazon RDS for PostgreSQL DB instance. The company wants to maintain predictable database performance and ensure that the Lambda invocations do not overload the database with too many connections. What should a solutions architect do to meet these requirements?

A. Point the client driver at an RDS custom endpoint. Deploy the Lambda functions inside a VPC.
B. Point the client driver at an RDS proxy endpoint. Deploy the Lambda functions inside a VPC.
C. Point the client driver at an RDS custom endpoint. Deploy the Lambda functions outside a VPC.
D. Point the client driver at an RDS proxy endpoint. Deploy the Lambda functions outside a VPC.


Question # 108

A company wants to analyze and generate reports to track the usage of its mobile app. The app is popular and has a global user base. The company uses a custom report building program to analyze application usage. The program generates multiple reports during the last week of each month. The program takes less than 10 minutes to produce each report. The company rarely uses the program to generate reports outside of the last week of each month. The company wants to generate reports in the least amount of time when the reports are requested. Which solution will meet these requirements MOST cost-effectively?

A. Run the program by using Amazon EC2 On-Demand Instances. Create an Amazon EventBridge rule to start the EC2 instances when reports are requested. Run the EC2 instances continuously during the last week of each month.
B. Run the program in AWS Lambda. Create an Amazon EventBridge rule to run a Lambda function when reports are requested.
C. Run the program in Amazon Elastic Container Service (Amazon ECS). Schedule Amazon ECS to run the program when reports are requested.
D. Run the program by using Amazon EC2 Spot Instances. Create an Amazon EventBridge rule to start the EC2 instances when reports are requested. Run the EC2 instances continuously during the last week of each month.


Question # 109

A company has established a new AWS account. The account is newly provisioned and no changes have been made to the default settings. The company is concerned about the security of the AWS account root user. What should be done to secure the root user?

A. Create IAM users for daily administrative tasks. Disable the root user.
B. Create IAM users for daily administrative tasks. Enable multi-factor authentication on the root user.
C. Generate an access key for the root user. Use the access key for daily administration tasks instead of the AWS Management Console.
D. Provide the root user credentials to the most senior solutions architect. Have the solutions architect use the root user for daily administration tasks.


Question # 110

A company has users all around the world accessing its HTTP-based application deployed on Amazon EC2 instances in multiple AWS Regions. The company wants to improve the availability and performance of the application. The company also wants to protect the application against common web exploits that may affect availability, compromise security, or consume excessive resources. Static IP addresses are required. What should a solutions architect recommend to accomplish this?

A. Put the EC2 instances behind Network Load Balancers (NLBs) in each Region. Deploy AWS WAF on the NLBs. Create an accelerator using AWS Global Accelerator and register the NLBs as endpoints.
B. Put the EC2 instances behind Application Load Balancers (ALBs) in each Region. Deploy AWS WAF on the ALBs. Create an accelerator using AWS Global Accelerator and register the ALBs as endpoints.
C. Put the EC2 instances behind Network Load Balancers (NLBs) in each Region. Deploy AWS WAF on the NLBs. Create an Amazon CloudFront distribution with an origin that uses Amazon Route 53 latency-based routing to route requests to the NLBs.
D. Put the EC2 instances behind Application Load Balancers (ALBs) in each Region. Create an Amazon CloudFront distribution with an origin that uses Amazon Route 53 latency-based routing to route requests to the ALBs. Deploy AWS WAF on the CloudFront distribution.


Question # 111

A company runs multiple workloads in its on-premises data center. The company's data center cannot scale fast enough to meet the company's expanding business needs. The company wants to collect usage and configuration data about the on-premises servers and workloads to plan a migration to AWS. Which solution will meet these requirements?

A. Set the home AWS Region in AWS Migration Hub. Use AWS Systems Manager to collect data about the on-premises servers.
B. Set the home AWS Region in AWS Migration Hub. Use AWS Application Discovery Service to collect data about the on-premises servers.
C. Use the AWS Schema Conversion Tool (AWS SCT) to create the relevant templates. Use AWS Trusted Advisor to collect data about the on-premises servers.
D. Use the AWS Schema Conversion Tool (AWS SCT) to create the relevant templates. Use AWS Database Migration Service (AWS DMS) to collect data about the on-premises servers.


Question # 112

A company is designing a new web service that will run on Amazon EC2 instances behind an Elastic Load Balancing (ELB) load balancer. However, many of the web service clients can only reach IP addresses authorized on their firewalls. What should a solutions architect recommend to meet the clients' needs?

A. A Network Load Balancer with an associated Elastic IP address.
B. An Application Load Balancer with an associated Elastic IP address.
C. An A record in an Amazon Route 53 hosted zone pointing to an Elastic IP address.
D. An EC2 instance with a public IP address running as a proxy in front of the load balancer.


Question # 113

A company is running its production and nonproduction environment workloads in multiple AWS accounts. The accounts are in an organization in AWS Organizations. The company needs to design a solution that will prevent the modification of cost usage tags. Which solution will meet these requirements?

A. Create a custom AWS Config rule to prevent tag modification except by authorized principals.
B. Create a custom trail in AWS CloudTrail to prevent tag modification.
C. Create a service control policy (SCP) to prevent tag modification except by authorized principals.
D. Create custom Amazon CloudWatch logs to prevent tag modification.


Question # 114

A company stores multiple Amazon Machine Images (AMIs) in an AWS account to launch its Amazon EC2 instances. The AMIs contain critical data and configurations that are necessary for the company's operations. The company wants to implement a solution that will recover accidentally deleted AMIs quickly and efficiently. Which solution will meet these requirements with the LEAST operational overhead?

A. Create Amazon Elastic Block Store (Amazon EBS) snapshots of the AMIs. Store the snapshots in a separate AWS account.
B. Copy all AMIs to another AWS account periodically.
C. Create a retention rule in Recycle Bin.
D. Upload the AMIs to an Amazon S3 bucket that has Cross-Region Replication.


Question # 115

An ecommerce company is running a seasonal online sale. The company hosts its website on Amazon EC2 instances spanning multiple Availability Zones. The company wants its website to manage sudden traffic increases during the sale. Which solution will meet these requirements MOST cost-effectively?

A. Create an Auto Scaling group that is large enough to handle peak traffic load. Stop half of the Amazon EC2 instances. Configure the Auto Scaling group to use the stopped instances to scale out when traffic increases.
B. Create an Auto Scaling group for the website. Set the minimum size of the Auto Scaling group so that it can handle high traffic volumes without the need to scale out.
C. Use Amazon CloudFront and Amazon ElastiCache to cache dynamic content with an Auto Scaling group set as the origin. Configure the Auto Scaling group with the instances necessary to populate CloudFront and ElastiCache. Scale in after the cache is fully populated.
D. Configure an Auto Scaling group to scale out as traffic increases. Create a launch template to start new instances from a preconfigured Amazon Machine Image (AMI).


Question # 116

A company needs a solution to prevent photos with unwanted content from being uploaded to the company's web application. The solution must not involve training a machine learning (ML) model. Which solution will meet these requirements?

A. Create and deploy a model by using Amazon SageMaker Autopilot. Create a real-time endpoint that the web application invokes when new photos are uploaded.
B. Create an AWS Lambda function that uses Amazon Rekognition to detect unwanted content. Create a Lambda function URL that the web application invokes when new photos are uploaded.
C. Create an Amazon CloudFront function that uses Amazon Comprehend to detect unwanted content. Associate the function with the web application.
D. Create an AWS Lambda function that uses Amazon Rekognition Video to detect unwanted content. Create a Lambda function URL that the web application invokes when new photos are uploaded.


Question # 117

A solutions architect creates a VPC that includes two public subnets and two private subnets. A corporate security mandate requires the solutions architect to launch all Amazon EC2 instances in a private subnet. However, when the solutions architect launches an EC2 instance that runs a web server on ports 80 and 443 in a private subnet, no external internet traffic can connect to the server. What should the solutions architect do to resolve this issue?

A. Attach the EC2 instance to an Auto Scaling group in a private subnet. Ensure that the DNS record for the website resolves to the Auto Scaling group identifier.
B. Provision an internet-facing Application Load Balancer (ALB) in a public subnet. Add the EC2 instance to the target group that is associated with the ALB. Ensure that the DNS record for the website resolves to the ALB.
C. Launch a NAT gateway in a private subnet. Update the route table for the private subnets to add a default route to the NAT gateway. Attach a public Elastic IP address to the NAT gateway.
D. Ensure that the security group that is attached to the EC2 instance allows HTTP traffic on port 80 and HTTPS traffic on port 443. Ensure that the DNS record for the website resolves to the public IP address of the EC2 instance.


Question # 118

A company has deployed its newest product on AWS. The product runs in an Auto Scaling group behind a Network Load Balancer. The company stores the product's objects in an Amazon S3 bucket. The company recently experienced malicious attacks against its systems. The company needs a solution that continuously monitors for malicious activity in the AWS account, workloads, and access patterns to the S3 bucket. The solution must also report suspicious activity and display the information on a dashboard. Which solution will meet these requirements?

A. Configure Amazon Macie to monitor and report findings to AWS Config.
B. Configure Amazon Inspector to monitor and report findings to AWS CloudTrail.
C. Configure Amazon GuardDuty to monitor and report findings to AWS Security Hub.
D. Configure AWS Config to monitor and report findings to Amazon EventBridge.


Question # 119

A development team is collaborating with another company to create an integrated product. The other company needs to access an Amazon Simple Queue Service (Amazon SQS) queue that is contained in the development team's account. The other company wants to poll the queue without giving up its own account permissions to do so. How should a solutions architect provide access to the SQS queue?

A. Create an instance profile that provides the other company access to the SQS queue.
B. Create an IAM policy that provides the other company access to the SQS queue.
C. Create an SQS access policy that provides the other company access to the SQS queue.
D. Create an Amazon Simple Notification Service (Amazon SNS) access policy that provides the other company access to the SQS queue.
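For context on the SQS access policy approach, the sketch below shows the shape of a resource-based queue policy that grants another account permission to poll the queue. Both account IDs and the queue ARN are hypothetical placeholders:

```python
import json

# Illustrative SQS access (resource-based) policy. The Principal names the
# other company's account; the policy attaches to the queue itself, so the
# other company does not have to assume a role in the queue owner's account.
queue_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowPartnerPolling",
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::111122223333:root"},  # hypothetical
            "Action": [
                "sqs:ReceiveMessage",
                "sqs:DeleteMessage",
                "sqs:GetQueueAttributes",
            ],
            "Resource": "arn:aws:sqs:us-east-1:444455556666:integration-queue",
        }
    ],
}
print(json.dumps(queue_policy, indent=2))
```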


Question # 120

A company has an organization in AWS Organizations. The company runs Amazon EC2 instances across four AWS accounts in the root organizational unit (OU). There are three nonproduction accounts and one production account. The company wants to prohibit users from launching EC2 instances of a certain size in the nonproduction accounts. The company has created a service control policy (SCP) to deny access to launch instances that use the prohibited types. Which solutions to deploy the SCP will meet these requirements? (Select TWO.)

A. Attach the SCP to the root OU for the organization.
B. Attach the SCP to the three nonproduction Organizations member accounts.
C. Attach the SCP to the Organizations management account.
D. Create an OU for the production account. Attach the SCP to the OU. Move the production member account into the new OU.
E. Create an OU for the required accounts. Attach the SCP to the OU. Move the nonproduction member accounts into the new OU.
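As background, an SCP of the kind the question describes can be sketched as follows. The instance type shown is a hypothetical stand-in for the company's prohibited size:

```python
import json

# Illustrative service control policy (SCP) denying launches of a
# prohibited EC2 instance type. Attached to an OU that contains only the
# nonproduction accounts, it constrains those accounts without affecting
# the production account (and SCPs never restrict the management account).
scp = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Deny",
            "Action": "ec2:RunInstances",
            "Resource": "arn:aws:ec2:*:*:instance/*",
            "Condition": {
                # Hypothetical prohibited size.
                "StringEquals": {"ec2:InstanceType": ["x2iedn.32xlarge"]}
            },
        }
    ],
}
print(json.dumps(scp, indent=2))
```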


Question # 121

A company uses an organization in AWS Organizations to manage AWS accounts that contain applications. The company sets up a dedicated monitoring member account in the organization. The company wants to query and visualize observability data across the accounts by using Amazon CloudWatch. Which solution will meet these requirements?

A. Enable CloudWatch cross-account observability for the monitoring account. Deploy an AWS CloudFormation template provided by the monitoring account in each AWS account to share the data with the monitoring account.
B. Set up service control policies (SCPs) to provide access to CloudWatch in the monitoring account under the Organizations root organizational unit (OU).
C. Configure a new IAM user in the monitoring account. In each AWS account, configure an IAM policy to have access to query and visualize the CloudWatch data in the account. Attach the new IAM policy to the new IAM user.
D. Create a new IAM user in the monitoring account. Create cross-account IAM policies in each AWS account. Attach the IAM policies to the new IAM user.


Question # 122

A company wants to rearchitect a large-scale web application to a serverless microservices architecture. The application uses Amazon EC2 instances and is written in Python. The company selected one component of the web application to test as a microservice. The component supports hundreds of requests each second. The company wants to create and test the microservice on an AWS solution that supports Python. The solution must also scale automatically and require minimal infrastructure and minimal operational support. Which solution will meet these requirements?

A. Use a Spot Fleet with auto scaling of EC2 instances that run the most recent Amazon Linux operating system.
B. Use an AWS Elastic Beanstalk web server environment that has high availability configured.
C. Use Amazon Elastic Kubernetes Service (Amazon EKS). Launch Auto Scaling groups of self-managed EC2 instances.
D. Use an AWS Lambda function that runs custom developed code.


Question # 123

A company uses an on-premises network-attached storage (NAS) system to provide file shares to its high performance computing (HPC) workloads. The company wants to migrate its latency-sensitive HPC workloads and its storage to the AWS Cloud. The company must be able to provide NFS and SMB multi-protocol access from the file system. Which solution will meet these requirements with the LEAST latency? (Select TWO.)

A. Deploy compute optimized EC2 instances into a cluster placement group.
B. Deploy compute optimized EC2 instances into a partition placement group.
C. Attach the EC2 instances to an Amazon FSx for Lustre file system.
D. Attach the EC2 instances to an Amazon FSx for OpenZFS file system.
E. Attach the EC2 instances to an Amazon FSx for NetApp ONTAP file system.


Question # 124

A company needs to connect several VPCs in the us-east-1 Region that span hundreds of AWS accounts. The company's networking team has its own AWS account to manage the cloud network. What is the MOST operationally efficient solution to connect the VPCs?

A. Set up VPC peering connections between each VPC. Update each associated subnet's route table.
B. Configure a NAT gateway and an internet gateway in each VPC to connect each VPC through the internet.
C. Create an AWS Transit Gateway in the networking team's AWS account. Configure static routes from each VPC.
D. Deploy VPN gateways in each VPC. Create a transit VPC in the networking team's AWS account to connect to each VPC.


Question # 125

A company's infrastructure consists of Amazon EC2 instances and an Amazon RDS DB instance in a single AWS Region. The company wants to back up its data in a separate Region. Which solution will meet these requirements with the LEAST operational overhead?

A. Use AWS Backup to copy EC2 backups and RDS backups to the separate Region.
B. Use Amazon Data Lifecycle Manager (Amazon DLM) to copy EC2 backups and RDS backups to the separate Region.
C. Create Amazon Machine Images (AMIs) of the EC2 instances. Copy the AMIs to the separate Region. Create a read replica for the RDS DB instance in the separate Region.
D. Create Amazon Elastic Block Store (Amazon EBS) snapshots. Copy the EBS snapshots to the separate Region. Create RDS snapshots. Export the RDS snapshots to Amazon S3. Configure S3 Cross-Region Replication (CRR) to the separate Region.


Question # 126

A company has a large workload that runs every Friday evening. The workload runs on Amazon EC2 instances that are in two Availability Zones in the us-east-1 Region. Normally, the company must run no more than two instances at all times. However, the company wants to scale up to six instances each Friday to handle a regularly repeating increased workload. Which solution will meet these requirements with the LEAST operational overhead?

A. Create a reminder in Amazon EventBridge to scale the instances.
B. Create an Auto Scaling group that has a scheduled action.
C. Create an Auto Scaling group that uses manual scaling.
D. Create an Auto Scaling group that uses automatic scaling.
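To illustrate option B: a scheduled action tells the Auto Scaling group when to change its capacity, with no custom schedulers or glue code. A minimal sketch of the request parameters (the group and action names are hypothetical; the recurrence fields use standard cron syntax, evaluated in UTC by default):

```python
# Hypothetical parameters for autoscaling:PutScheduledUpdateGroupAction.
# Cron fields: minute, hour, day-of-month, month, day-of-week (5 = Friday).
scale_up = {
    "AutoScalingGroupName": "weekly-workload-asg",  # hypothetical group name
    "ScheduledActionName": "friday-scale-up",
    "Recurrence": "0 18 * * 5",  # every Friday at 18:00
    "MinSize": 2,
    "MaxSize": 6,
    "DesiredCapacity": 6,
}

scale_down = {
    "AutoScalingGroupName": "weekly-workload-asg",
    "ScheduledActionName": "saturday-scale-down",
    "Recurrence": "0 2 * * 6",  # every Saturday at 02:00
    "MinSize": 2,
    "MaxSize": 6,
    "DesiredCapacity": 2,
}
```

In practice each dictionary would be passed to boto3 as `client("autoscaling").put_scheduled_update_group_action(**scale_up)`. Because the schedule is declarative and managed by the service, this is why the scheduled action carries the least operational overhead of the four options.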


Question # 127

A company has deployed its application on Amazon EC2 instances with an Amazon RDS database. The company used the principle of least privilege to configure the database access credentials. The company's security team wants to protect the application and the database from SQL injection and other web-based attacks. Which solution will meet these requirements with the LEAST operational overhead?

A. Use security groups and network ACLs to secure the database and application servers.
B. Use AWS WAF to protect the application. Use RDS parameter groups to configure the security settings.
C. Use AWS Network Firewall to protect the application and the database.
D. Use different database accounts in the application code for different functions. Avoid granting excessive privileges to the database users.


Question # 128

A solutions architect is creating a new Amazon CloudFront distribution for an application. Some of the information submitted by users is sensitive. The application uses HTTPS but needs another layer of security. The sensitive information should be protected throughout the entire application stack, and access to the information should be restricted to certain applications. Which action should the solutions architect take?

A. Configure a CloudFront signed URL.
B. Configure a CloudFront signed cookie.
C. Configure a CloudFront field-level encryption profile.
D. Configure CloudFront and set the Origin Protocol Policy setting to HTTPS Only for the Viewer Protocol Policy.


Question # 129

A company manages AWS accounts in AWS Organizations. AWS IAM Identity Center (AWS Single Sign-On) and AWS Control Tower are configured for the accounts. The company wants to manage multiple user permissions across all the accounts. The permissions will be used by multiple IAM users and must be split between the developer and administrator teams. Each team requires different permissions. The company wants a solution that accommodates new users hired onto both teams. Which solution will meet these requirements with the LEAST operational overhead?

A. Create individual users in IAM Identity Center for each account. Create separate developer and administrator groups in IAM Identity Center. Assign the users to the appropriate groups. Create a custom IAM policy for each group to set fine-grained permissions.
B. Create individual users in IAM Identity Center for each account. Create separate developer and administrator groups in IAM Identity Center. Assign the users to the appropriate groups. Attach AWS managed IAM policies to each user as needed for fine-grained permissions.
C. Create individual users in IAM Identity Center. Create new developer and administrator groups in IAM Identity Center. Create new permission sets that include the appropriate IAM policies for each group. Assign the new groups to the appropriate accounts. Assign the new permission sets to the new groups. When new users are hired, add them to the appropriate group.
D. Create individual users in IAM Identity Center. Create new permission sets that include the appropriate IAM policies for each user. Assign the users to the appropriate accounts. Grant additional IAM permissions to the users from within specific accounts. When new users are hired, add them to IAM Identity Center and assign them to the accounts.


Question # 130

A company is deploying an application in three AWS Regions using an Application Load Balancer. Amazon Route 53 will be used to distribute traffic between these Regions. Which Route 53 configuration should a solutions architect use to provide the MOST high-performing experience?

A. Create an A record with a latency policy.
B. Create an A record with a geolocation policy.
C. Create a CNAME record with a failover policy.
D. Create a CNAME record with a geoproximity policy.


Question # 131

A company wants to migrate its three-tier application from on premises to AWS. The web tier and the application tier are running on third-party virtual machines (VMs). The database tier is running on MySQL. The company needs to migrate the application by making the fewest possible changes to the architecture. The company also needs a database solution that can restore data to a specific point in time. Which solution will meet these requirements with the LEAST operational overhead?

A. Migrate the web tier and the application tier to Amazon EC2 instances in private subnets. Migrate the database tier to Amazon RDS for MySQL in private subnets.
B. Migrate the web tier to Amazon EC2 instances in public subnets. Migrate the application tier to EC2 instances in private subnets. Migrate the database tier to Amazon Aurora MySQL in private subnets.
C. Migrate the web tier to Amazon EC2 instances in public subnets. Migrate the application tier to EC2 instances in private subnets. Migrate the database tier to Amazon RDS for MySQL in private subnets.
D. Migrate the web tier and the application tier to Amazon EC2 instances in public subnets. Migrate the database tier to Amazon Aurora MySQL in public subnets.


Question # 132

A solutions architect must provide an automated solution for a company's compliance policy that states security groups cannot include a rule that allows SSH from 0.0.0.0/0. The company needs to be notified if there is any breach in the policy. A solution is needed as soon as possible. What should the solutions architect do to meet these requirements with the LEAST operational overhead?

A. Write an AWS Lambda script that monitors security groups for SSH being open to 0.0.0.0/0 addresses and creates a notification every time it finds one.
B. Enable the restricted-ssh AWS Config managed rule and generate an Amazon Simple Notification Service (Amazon SNS) notification when a noncompliant rule is created.
C. Create an IAM role with permissions to globally open security groups and network ACLs. Create an Amazon Simple Notification Service (Amazon SNS) topic to generate a notification every time the role is assumed by a user.
D. Configure a service control policy (SCP) that prevents non-administrative users from creating or editing security groups. Create a notification in the ticketing system when a user requests a rule that needs administrator permissions.
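To illustrate option B: the managed rule is enabled with a single PutConfigRule call. The source identifier for restricted-ssh is, to the best of my knowledge, INCOMING_SSH_DISABLED; confirm it against the AWS Config managed rules list. A sketch of the parameters:

```python
# Sketch of config:PutConfigRule parameters for the restricted-ssh
# AWS-managed rule (identifier assumed to be INCOMING_SSH_DISABLED).
config_rule = {
    "ConfigRuleName": "restricted-ssh",
    "Description": "Checks that security groups do not allow unrestricted SSH (0.0.0.0/0).",
    "Source": {
        "Owner": "AWS",  # AWS-managed rule, no custom Lambda needed
        "SourceIdentifier": "INCOMING_SSH_DISABLED",
    },
    # Evaluate only security group resources.
    "Scope": {"ComplianceResourceTypes": ["AWS::EC2::SecurityGroup"]},
}
```

Compliance changes can then be routed to an Amazon SNS topic through an EventBridge rule that matches Config compliance-change events, which delivers the notification without any custom polling code.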


Question # 133

A company is developing an application that will run on a production Amazon Elastic Kubernetes Service (Amazon EKS) cluster. The EKS cluster has managed node groups that are provisioned with On-Demand Instances. The company needs a dedicated EKS cluster for development work. The company will use the development cluster infrequently to test the resiliency of the application. The EKS cluster must manage all the nodes. Which solution will meet these requirements MOST cost-effectively?

A. Create a managed node group that contains only Spot Instances.
B. Create two managed node groups. Provision one node group with On-Demand Instances. Provision the second node group with Spot Instances.
C. Create an Auto Scaling group that has a launch configuration that uses Spot Instances. Configure the user data to add the nodes to the EKS cluster.
D. Create a managed node group that contains only On-Demand Instances.


Question # 134

A company is deploying an application that processes large quantities of data in parallel. The company plans to use Amazon EC2 instances for the workload. The network architecture must be configurable to prevent groups of nodes from sharing the same underlying hardware. Which networking solution meets these requirements?

A. Run the EC2 instances in a spread placement group.
B. Group the EC2 instances in separate accounts.
C. Configure the EC2 instances with dedicated tenancy.
D. Configure the EC2 instances with shared tenancy.


Question # 135

A company is preparing a new data platform that will ingest real-time streaming data from multiple sources. The company needs to transform the data before writing the data to Amazon S3. The company needs the ability to use SQL to query the transformed data. Which solutions will meet these requirements? (Choose two.)

A. Use Amazon Kinesis Data Streams to stream the data. Use Amazon Kinesis Data Analytics to transform the data. Use Amazon Kinesis Data Firehose to write the data to Amazon S3. Use Amazon Athena to query the transformed data from Amazon S3.
B. Use Amazon Managed Streaming for Apache Kafka (Amazon MSK) to stream the data. Use AWS Glue to transform the data and to write the data to Amazon S3. Use Amazon Athena to query the transformed data from Amazon S3.
C. Use AWS Database Migration Service (AWS DMS) to ingest the data. Use Amazon EMR to transform the data and to write the data to Amazon S3. Use Amazon Athena to query the transformed data from Amazon S3.
D. Use Amazon Managed Streaming for Apache Kafka (Amazon MSK) to stream the data. Use Amazon Kinesis Data Analytics to transform the data and to write the data to Amazon S3. Use the Amazon RDS query editor to query the transformed data from Amazon S3.
E. Use Amazon Kinesis Data Streams to stream the data. Use AWS Glue to transform the data. Use Amazon Kinesis Data Firehose to write the data to Amazon S3. Use the Amazon RDS query editor to query the transformed data from Amazon S3.


Question # 136

A company runs an application on Amazon EC2 instances. The company needs to implement a disaster recovery (DR) solution for the application. The DR solution needs to have a recovery time objective (RTO) of less than 4 hours. The DR solution also needs to use the fewest possible AWS resources during normal operations. Which solution will meet these requirements in the MOST operationally efficient way?

A. Create Amazon Machine Images (AMIs) to back up the EC2 instances. Copy the AMIs to a secondary AWS Region. Automate infrastructure deployment in the secondary Region by using AWS Lambda and custom scripts.
B. Create Amazon Machine Images (AMIs) to back up the EC2 instances. Copy the AMIs to a secondary AWS Region. Automate infrastructure deployment in the secondary Region by using AWS CloudFormation.
C. Launch EC2 instances in a secondary AWS Region. Keep the EC2 instances in the secondary Region active at all times.
D. Launch EC2 instances in a secondary Availability Zone. Keep the EC2 instances in the secondary Availability Zone active at all times.


Question # 137

A solutions architect needs to review a company's Amazon S3 buckets to discover personally identifiable information (PII). The company stores the PII data in the us-east-1 Region and us-west-2 Region. Which solution will meet these requirements with the LEAST operational overhead?

A. Configure Amazon Macie in each Region. Create a job to analyze the data that is in Amazon S3.
B. Configure AWS Security Hub for all Regions. Create an AWS Config rule to analyze the data that is in Amazon S3.
C. Configure Amazon Inspector to analyze the data that is in Amazon S3.
D. Configure Amazon GuardDuty to analyze the data that is in Amazon S3.


Question # 138

A solutions architect is designing a workload that will store hourly energy consumption by business tenants in a building. The sensors will feed a database through HTTP requests that will add up usage for each tenant. The solutions architect must use managed services when possible. The workload will receive more features in the future as the solutions architect adds independent components. Which solution will meet these requirements with the LEAST operational overhead?

A. Use Amazon API Gateway with AWS Lambda functions to receive the data from the sensors, process the data, and store the data in an Amazon DynamoDB table.
B. Use an Elastic Load Balancer that is supported by an Auto Scaling group of Amazon EC2 instances to receive and process the data from the sensors. Use an Amazon S3 bucket to store the processed data.
C. Use Amazon API Gateway with AWS Lambda functions to receive the data from the sensors, process the data, and store the data in a Microsoft SQL Server Express database on an Amazon EC2 instance.
D. Use an Elastic Load Balancer that is supported by an Auto Scaling group of Amazon EC2 instances to receive and process the data from the sensors. Use an Amazon Elastic File System (Amazon EFS) shared file system to store the processed data.


Question # 139

A company is deploying a new public web application to AWS. The application will run behind an Application Load Balancer (ALB). The application needs to be encrypted at the edge with an SSL/TLS certificate that is issued by an external certificate authority (CA). The certificate must be rotated each year before the certificate expires. What should a solutions architect do to meet these requirements?

A. Use AWS Certificate Manager (ACM) to issue an SSL/TLS certificate. Apply the certificate to the ALB. Use the managed renewal feature to automatically rotate the certificate.
B. Use AWS Certificate Manager (ACM) to issue an SSL/TLS certificate. Import the key material from the certificate. Apply the certificate to the ALB. Use the managed renewal feature to automatically rotate the certificate.
C. Use AWS Private Certificate Authority to issue an SSL/TLS certificate from the root CA. Apply the certificate to the ALB. Use the managed renewal feature to automatically rotate the certificate.
D. Use AWS Certificate Manager (ACM) to import an SSL/TLS certificate. Apply the certificate to the ALB. Use Amazon EventBridge to send a notification when the certificate is nearing expiration. Rotate the certificate manually.


Question # 140

A company runs an infrastructure monitoring service. The company is building a new feature that will enable the service to monitor data in customer AWS accounts. The new feature will call AWS APIs in customer accounts to describe Amazon EC2 instances and read Amazon CloudWatch metrics. What should the company do to obtain access to customer accounts in the MOST secure way?

A. Ensure that the customers create an IAM role in their account with read-only EC2 and CloudWatch permissions and a trust policy to the company's account.
B. Create a serverless API that implements a token vending machine to provide temporary AWS credentials for a role with read-only EC2 and CloudWatch permissions.
C. Ensure that the customers create an IAM user in their account with read-only EC2 and CloudWatch permissions. Encrypt and store customer access and secret keys in a secrets management system.
D. Ensure that the customers create an Amazon Cognito user in their account to use an IAM role with read-only EC2 and CloudWatch permissions. Encrypt and store the Amazon Cognito user and password in a secrets management system.
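To illustrate option A, the standard cross-account pattern: the customer creates a role whose trust policy names the monitoring company's account, ideally with an ExternalId condition to guard against the confused-deputy problem. A sketch of the trust policy (the account ID and ExternalId are hypothetical):

```python
import json

# Hypothetical trust policy the customer attaches to the read-only role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            # The monitoring company's account (hypothetical account ID).
            "Principal": {"AWS": "arn:aws:iam::111122223333:root"},
            "Action": "sts:AssumeRole",
            # ExternalId guards against the confused-deputy problem.
            "Condition": {
                "StringEquals": {"sts:ExternalId": "example-external-id"}
            },
        }
    ],
}

trust_policy_json = json.dumps(trust_policy, indent=2)
```

The monitoring service then calls sts:AssumeRole to obtain short-lived credentials; no long-term access keys ever leave the customer's account, which is what makes this more secure than options C and D.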


Question # 141

A company has two VPCs that are located in the us-west-2 Region within the same AWS account. The company needs to allow network traffic between these VPCs. Approximately 500 GB of data transfer will occur between the VPCs each month. What is the MOST cost-effective solution to connect these VPCs?

A. Implement AWS Transit Gateway to connect the VPCs. Update the route tables of each VPC to use the transit gateway for inter-VPC communication.
B. Implement an AWS Site-to-Site VPN tunnel between the VPCs. Update the route tables of each VPC to use the VPN tunnel for inter-VPC communication.
C. Set up a VPC peering connection between the VPCs. Update the route tables of each VPC to use the VPC peering connection for inter-VPC communication.
D. Set up a 1 Gbps AWS Direct Connect connection between the VPCs. Update the route tables of each VPC to use the Direct Connect connection for inter-VPC communication.


Question # 142

A company runs a web application that is deployed on Amazon EC2 instances in the private subnet of a VPC. An Application Load Balancer (ALB) that extends across the public subnets directs web traffic to the EC2 instances. The company wants to implement new security measures to restrict inbound traffic from the ALB to the EC2 instances while preventing access from any other source inside or outside the private subnet of the EC2 instances. Which solution will meet these requirements?

A. Configure a route in a route table to direct traffic from the internet to the private IP addresses of the EC2 instances.
B. Configure the security group for the EC2 instances to only allow traffic that comes from the security group for the ALB.
C. Move the EC2 instances into the public subnet. Give the EC2 instances a set of Elastic IP addresses.
D. Configure the security group for the ALB to allow any TCP traffic on any port.
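To illustrate option B: a security group rule can reference another security group instead of a CIDR range, so traffic is admitted only when its source carries the ALB's security group. A sketch of the ingress rule parameters (both group IDs are hypothetical):

```python
# Sketch of ec2:AuthorizeSecurityGroupIngress parameters for the
# security group attached to the EC2 instances.
ingress = {
    "GroupId": "sg-0instances0example",  # instances' security group (hypothetical)
    "IpPermissions": [
        {
            "IpProtocol": "tcp",
            "FromPort": 80,
            "ToPort": 80,
            # Reference the ALB's security group rather than an IP range,
            # so only traffic originating from the ALB is allowed in.
            "UserIdGroupPairs": [{"GroupId": "sg-0alb0example"}],
        }
    ],
}
```

Because security groups deny everything not explicitly allowed, no other source inside or outside the private subnet matches this rule, which is exactly the restriction the question asks for.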


Question # 143

A company runs applications on AWS that connect to the company's Amazon RDS database. The applications scale on weekends and at peak times of the year. The company wants to scale the database more effectively for its applications that connect to the database. Which solution will meet these requirements with the LEAST operational overhead?

A. Use Amazon DynamoDB with connection pooling with a target group configuration for the database. Change the applications to use the DynamoDB endpoint.
B. Use Amazon RDS Proxy with a target group for the database. Change the applications to use the RDS Proxy endpoint.
C. Use a custom proxy that runs on Amazon EC2 as an intermediary to the database. Change the applications to use the custom proxy endpoint.
D. Use an AWS Lambda function to provide connection pooling with a target group configuration for the database. Change the applications to use the Lambda function.


Question # 144

A company runs an application that uses Amazon RDS for PostgreSQL. The application receives traffic only on weekdays during business hours. The company wants to optimize costs and reduce operational overhead based on this usage. Which solution will meet these requirements?

A. Use the Instance Scheduler on AWS to configure start and stop schedules.
B. Turn off automatic backups. Create weekly manual snapshots of the database.
C. Create a custom AWS Lambda function to start and stop the database based on minimum CPU utilization.
D. Purchase All Upfront reserved DB instances.


Question # 145

A company needs to connect several VPCs in the us-east-1 Region that span hundreds of AWS accounts. The company's networking team has its own AWS account to manage the cloud network. What is the MOST operationally efficient solution to connect the VPCs?

A. Set up VPC peering connections between each VPC. Update each associated subnet's route table.
B. Configure a NAT gateway and an internet gateway in each VPC to connect each VPC through the internet.
C. Create an AWS Transit Gateway in the networking team's AWS account. Configure static routes from each VPC.
D. Deploy VPN gateways in each VPC. Create a transit VPC in the networking team's AWS account to connect to each VPC.


Question # 146

A company has created a multi-tier application for its ecommerce website. The website uses an Application Load Balancer that resides in the public subnets, a web tier in the public subnets, and a MySQL cluster hosted on Amazon EC2 instances in the private subnets. The MySQL database needs to retrieve product catalog and pricing information that is hosted on the internet by a third-party provider. A solutions architect must devise a strategy that maximizes security without increasing operational overhead. What should the solutions architect do to meet these requirements?

A. Deploy a NAT instance in the VPC. Route all the internet-based traffic through the NAT instance.
B. Deploy a NAT gateway in the public subnets. Modify the private subnet route table to direct all internet-bound traffic to the NAT gateway.
C. Configure an internet gateway and attach it to the VPC. Modify the private subnet route table to direct internet-bound traffic to the internet gateway.
D. Configure a virtual private gateway and attach it to the VPC. Modify the private subnet route table to direct internet-bound traffic to the virtual private gateway.


Question # 147

A solutions architect is designing a highly available Amazon ElastiCache for Redis based solution. The solutions architect needs to ensure that failures do not result in performance degradation or loss of data locally and within an AWS Region. The solution needs to provide high availability at the node level and at the Region level. Which solution will meet these requirements?

A. Use Multi-AZ Redis replication groups with shards that contain multiple nodes.
B. Use Redis shards that contain multiple nodes with Redis append only files (AOF) turned on.
C. Use a Multi-AZ Redis cluster with more than one read replica in the replication group.
D. Use Redis shards that contain multiple nodes with Auto Scaling turned on.


Question # 148

A company runs its applications on Amazon EC2 instances. The company performs periodic financial assessments of its AWS costs. The company recently identified unusual spending. The company needs a solution to prevent unusual spending. The solution must monitor costs and notify responsible stakeholders in the event of unusual spending. Which solution will meet these requirements?

A. Use an AWS Budgets template to create a zero spend budget.
B. Create an AWS Cost Anomaly Detection monitor in the AWS Billing and Cost Management console.
C. Create AWS Pricing Calculator estimates for the current running workload pricing details.
D. Use Amazon CloudWatch to monitor costs and to identify unusual spending.


Question # 149

A company wants to use an event-driven programming model with AWS Lambda. The company wants to reduce startup latency for Lambda functions that run on Java 11. The company does not have strict latency requirements for the applications. The company wants to reduce cold starts and outlier latencies when a function scales up. Which solution will meet these requirements MOST cost-effectively?

A. Configure Lambda provisioned concurrency.
B. Increase the timeout of the Lambda functions.
C. Increase the memory of the Lambda functions.
D. Configure Lambda SnapStart.
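To illustrate option D: SnapStart is a per-function setting that applies to published versions, which is why it reduces Java cold starts without the ongoing per-hour cost of provisioned concurrency. A sketch of the configuration parameters (the function name is hypothetical):

```python
# Sketch of lambda:UpdateFunctionConfiguration parameters that
# enable SnapStart for a Java function.
snapstart_config = {
    "FunctionName": "orders-java11",  # hypothetical function name
    # Snapshots are taken when a version is published; invocations of
    # that version resume from the cached snapshot.
    "SnapStart": {"ApplyOn": "PublishedVersions"},
}
```

After updating the configuration, the function must be published as a version (and typically invoked through an alias) for SnapStart to take effect.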


Question # 150

A company needs to minimize the cost of its 1 Gbps AWS Direct Connect connection. The company's average connection utilization is less than 10%. A solutions architect must recommend a solution that will reduce the cost without compromising security. Which solution will meet these requirements?

A. Set up a new 1 Gbps Direct Connect connection. Share the connection with another AWS account.
B. Set up a new 200 Mbps Direct Connect connection in the AWS Management Console.
C. Contact an AWS Direct Connect Partner to order a 1 Gbps connection. Share the connection with another AWS account.
D. Contact an AWS Direct Connect Partner to order a 200 Mbps hosted connection for an existing AWS account.


Question # 151

A company runs a website that stores images of historical events. Website users need the ability to search and view images based on the year that the event in the image occurred. On average, users request each image only once or twice a year. The company wants a highly available solution to store and deliver the images to users. Which solution will meet these requirements MOST cost-effectively?

A. Store images in Amazon Elastic Block Store (Amazon EBS). Use a web server that runs on Amazon EC2.
B. Store images in Amazon Elastic File System (Amazon EFS). Use a web server that runs on Amazon EC2.
C. Store images in Amazon S3 Standard. Use S3 Standard to directly deliver images by using a static website.
D. Store images in Amazon S3 Standard-Infrequent Access (S3 Standard-IA). Use S3 Standard-IA to directly deliver images by using a static website.


Question # 152

A company runs a website that uses a content management system (CMS) on Amazon EC2. The CMS runs on a single EC2 instance and uses an Amazon Aurora MySQL Multi-AZ DB instance for the data tier. Website images are stored on an Amazon Elastic Block Store (Amazon EBS) volume that is mounted inside the EC2 instance. Which combination of actions should a solutions architect take to improve the performance and resilience of the website? (Select TWO.)

A. Move the website images into an Amazon S3 bucket that is mounted on every EC2 instance.
B. Share the website images by using an NFS share from the primary EC2 instance. Mount this share on the other EC2 instances.
C. Move the website images onto an Amazon Elastic File System (Amazon EFS) file system that is mounted on every EC2 instance.
D. Create an Amazon Machine Image (AMI) from the existing EC2 instance. Use the AMI to provision new instances behind an Application Load Balancer as part of an Auto Scaling group. Configure the Auto Scaling group to maintain a minimum of two instances. Configure an accelerator in AWS Global Accelerator for the website.
E. Create an Amazon Machine Image (AMI) from the existing EC2 instance. Use the AMI to provision new instances behind an Application Load Balancer as part of an Auto Scaling group. Configure the Auto Scaling group to maintain a minimum of two instances. Configure an Amazon CloudFront distribution for the website.


Question # 153

A company has an on-premises MySQL database that handles transactional data. The company is migrating the database to the AWS Cloud. The migrated database must maintain compatibility with the company's applications that use the database. The migrated database also must scale automatically during periods of increased demand. Which migration solution will meet these requirements?

A. Use native MySQL tools to migrate the database to Amazon RDS for MySQL. Configure elastic storage scaling.
B. Migrate the database to Amazon Redshift by using the mysqldump utility. Turn on Auto Scaling for the Amazon Redshift cluster.
C. Use AWS Database Migration Service (AWS DMS) to migrate the database to Amazon Aurora. Turn on Aurora Auto Scaling.
D. Use AWS Database Migration Service (AWS DMS) to migrate the database to Amazon DynamoDB. Configure an Auto Scaling policy.


Question # 154

A company is using AWS Key Management Service (AWS KMS) keys to encrypt AWS Lambda environment variables. A solutions architect needs to ensure that the required permissions are in place to decrypt and use the environment variables. Which steps must the solutions architect take to implement the correct permissions? (Choose two.)

A. Add AWS KMS permissions in the Lambda resource policy.
B. Add AWS KMS permissions in the Lambda execution role.
C. Add AWS KMS permissions in the Lambda function policy.
D. Allow the Lambda execution role in the AWS KMS key policy.
E. Allow the Lambda resource policy in the AWS KMS key policy.
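Options B and D pair up: the execution role needs kms:Decrypt on the key, and the key policy must allow that role as a principal. A sketch of the two policy fragments (the account ID, role name, and key ARN are hypothetical):

```python
# Fragment attached to the Lambda execution role: allow decrypting
# with the specific KMS key (ARN is hypothetical).
role_policy_statement = {
    "Effect": "Allow",
    "Action": "kms:Decrypt",
    "Resource": "arn:aws:kms:us-east-1:111122223333:key/example-key-id",
}

# Fragment in the KMS key policy: allow the execution role as a principal
# (role ARN is hypothetical).
key_policy_statement = {
    "Sid": "AllowLambdaExecutionRoleDecrypt",
    "Effect": "Allow",
    "Principal": {"AWS": "arn:aws:iam::111122223333:role/lambda-exec-role"},
    "Action": "kms:Decrypt",
    "Resource": "*",  # in a key policy, "*" refers to this key itself
}
```

Both sides are required: an identity policy alone is not enough if the key policy does not grant the role access, and vice versa.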


Question # 155

A company operates an ecommerce website on Amazon EC2 instances behind an Application Load Balancer (ALB) in an Auto Scaling group. The site is experiencing performance issues related to a high request rate from illegitimate external systems with changing IP addresses. The security team is worried about potential DDoS attacks against the website. The company must block the illegitimate incoming requests in a way that has a minimal impact on legitimate users. What should a solutions architect recommend?

A. Deploy Amazon Inspector and associate it with the ALB.
B. Deploy AWS WAF, associate it with the ALB, and configure a rate-limiting rule.
C. Deploy rules to the network ACLs associated with the ALB to block the incoming traffic.
D. Deploy Amazon GuardDuty and enable rate-limiting protection when configuring GuardDuty.
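To illustrate option B: in AWS WAF, rate limiting is expressed as a rate-based rule statement that blocks any single source IP exceeding a request threshold per 5-minute window, so legitimate users below the threshold are unaffected. A sketch of the rule definition (the limit value is illustrative):

```python
# Sketch of a WAFv2 rate-based rule, as supplied in the Rules list of
# wafv2:CreateWebACL for a web ACL associated with the ALB.
rate_limit_rule = {
    "Name": "limit-high-request-ips",
    "Priority": 1,
    "Statement": {
        "RateBasedStatement": {
            "Limit": 2000,             # max requests per 5-minute window per IP (illustrative)
            "AggregateKeyType": "IP",  # count requests per source IP
        }
    },
    "Action": {"Block": {}},  # block IPs that exceed the limit
    "VisibilityConfig": {
        "SampledRequestsEnabled": True,
        "CloudWatchMetricsEnabled": True,
        "MetricName": "limit-high-request-ips",
    },
}
```

Because the rule tracks each source IP independently, it adapts to the changing addresses described in the question without maintaining a manual block list.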


Question # 156

A company uses Amazon Elastic Kubernetes Service (Amazon EKS) to run a container application. The EKS cluster stores sensitive information in the Kubernetes secrets object. The company wants to ensure that the information is encrypted. Which solution will meet these requirements with the LEAST operational overhead?

A. Use the container application to encrypt the information by using AWS Key Management Service (AWS KMS).
B. Enable secrets encryption in the EKS cluster by using AWS Key Management Service (AWS KMS).
C. Implement an AWS Lambda function to encrypt the information by using AWS Key Management Service (AWS KMS).
D. Use AWS Systems Manager Parameter Store to encrypt the information by using AWS Key Management Service (AWS KMS).


Question # 157

A company runs a three-tier application in two AWS Regions. The web tier, the application tier, and the database tier run on Amazon EC2 instances. The company uses Amazon RDS for Microsoft SQL Server Enterprise for the database tier. The database tier is experiencing high load when weekly and monthly reports are run. The company wants to reduce the load on the database tier. Which solution will meet these requirements with the LEAST administrative effort?

A. Create read replicas. Configure the reports to use the new read replicas.
B. Convert the RDS database to Amazon DynamoDB. Configure the reports to use DynamoDB.
C. Modify the existing RDS DB instances by selecting a larger instance size.
D. Modify the existing RDS DB instances and put the instances into an Auto Scaling group.


Question # 158

A company is building an ecommerce application and needs to store sensitive customer information. The company needs to give customers the ability to complete purchase transactions on the website. The company also needs to ensure that sensitive customer data is protected, even from database administrators. Which solution meets these requirements?

A. Store sensitive data in an Amazon Elastic Block Store (Amazon EBS) volume. Use EBS encryption to encrypt the data. Use an IAM instance role to restrict access.
B. Store sensitive data in Amazon RDS for MySQL. Use AWS Key Management Service (AWS KMS) client-side encryption to encrypt the data.
C. Store sensitive data in Amazon S3. Use AWS Key Management Service (AWS KMS) server-side encryption to encrypt the data. Use S3 bucket policies to restrict access.
D. Store sensitive data in Amazon FSx for Windows Server. Mount the file share on application servers. Use Windows file permissions to restrict access.


Question # 159

A company is moving its data and applications to AWS during a multiyear migration project. The company wants to securely access data on Amazon S3 from the company's AWS Region and from the company's on-premises location. The data must not traverse the internet. The company has established an AWS Direct Connect connection between its Region and its on-premises location. Which solution will meet these requirements?

A. Create gateway endpoints for Amazon S3. Use the gateway endpoints to securely access the data from the Region and the on-premises location.
B. Create a gateway in AWS Transit Gateway to access Amazon S3 securely from the Region and the on-premises location.
C. Create interface endpoints for Amazon S3. Use the interface endpoints to securely access the data from the Region and the on-premises location.
D. Use an AWS Key Management Service (AWS KMS) key to access the data securely from the Region and the on-premises location.


Question # 160

A company has a financial application that produces reports. The reports average 50 KB insize and are stored in Amazon S3. The reports are frequently accessed during the firstweek after production and must be stored for several years. The reports must beretrievable within 6 hours.Which solution meets these requirements MOST cost-effectively?

A. Use S3 Standard. Use an S3 Lifecycle rule to transition the reports to S3 Glacier after 7 days.
B. Use S3 Standard. Use an S3 Lifecycle rule to transition the reports to S3 Standard-Infrequent Access (S3 Standard-IA) after 7 days.
C. Use S3 Intelligent-Tiering. Configure S3 Intelligent-Tiering to transition the reports to S3 Standard-Infrequent Access (S3 Standard-IA) and S3 Glacier.
D. Use S3 Standard. Use an S3 Lifecycle rule to transition the reports to S3 Glacier Deep Archive after 7 days.
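The trade-off in this question comes down to arithmetic: the 6-hour retrieval requirement rules out one class, and the per-GB price decides among the rest. A minimal sketch, using illustrative per-GB-month prices and typical standard-retrieval times (not current AWS pricing; check the S3 pricing page):

```python
# Illustrative per-GB-month prices -- NOT current AWS pricing.
PRICES = {
    "STANDARD": 0.023,
    "STANDARD_IA": 0.0125,
    "GLACIER": 0.0036,          # S3 Glacier Flexible Retrieval
    "DEEP_ARCHIVE": 0.00099,    # S3 Glacier Deep Archive
}

# Typical standard-retrieval latency in hours for each class.
RETRIEVAL_HOURS = {
    "STANDARD": 0,
    "STANDARD_IA": 0,
    "GLACIER": 5,        # standard retrieval: roughly 3-5 hours
    "DEEP_ARCHIVE": 12,  # standard retrieval: within 12 hours
}

def cheapest_class(max_retrieval_hours: float) -> str:
    """Return the cheapest storage class whose retrieval time fits the SLA."""
    eligible = [c for c, h in RETRIEVAL_HOURS.items() if h <= max_retrieval_hours]
    return min(eligible, key=PRICES.get)
```

With a 6-hour retrieval requirement, Deep Archive drops out and Glacier Flexible Retrieval is the cheapest remaining class, which is why option A beats option D.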


Question # 161

A company needs to store contract documents. A contract lasts for 5 years. During the 5-year period, the company must ensure that the documents cannot be overwritten or deleted. The company needs to encrypt the documents at rest and rotate the encryption keys automatically every year. Which combination of steps should a solutions architect take to meet these requirements with the LEAST operational overhead? (Select TWO.)

A. Store the documents in Amazon S3. Use S3 Object Lock in governance mode.
B. Store the documents in Amazon S3. Use S3 Object Lock in compliance mode.
C. Use server-side encryption with Amazon S3 managed encryption keys (SSE-S3). Configure key rotation.
D. Use server-side encryption with AWS Key Management Service (AWS KMS) customer managed keys. Configure key rotation.
E. Use server-side encryption with AWS Key Management Service (AWS KMS) customer provided (imported) keys. Configure key rotation.


Question # 162

A company hosts an internal serverless application on AWS by using Amazon API Gateway and AWS Lambda. The company's employees report issues with high latency when they begin using the application each day. The company wants to reduce latency. Which solution will meet these requirements?

A. Increase the API Gateway throttling limit.
B. Set up a scheduled scaling to increase Lambda provisioned concurrency before employees begin to use the application each day.
C. Create an Amazon CloudWatch alarm to initiate a Lambda function as a target for the alarm at the beginning of each day.
D. Increase the Lambda function memory.
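Option B maps to an Application Auto Scaling scheduled action on the function's provisioned concurrency. A sketch of the request shape such an action takes, expressed as a plain dict (the function name, alias, schedule, and capacity are assumptions, not values from the question):

```python
# Shape of an Application Auto Scaling scheduled action that raises
# provisioned concurrency before the workday; all names are illustrative.
scheduled_action = {
    "ServiceNamespace": "lambda",
    "ScheduledActionName": "warm-up-before-work",
    "ResourceId": "function:internal-app:live",   # function:name:alias (assumed)
    "ScalableDimension": "lambda:function:ProvisionedConcurrency",
    "Schedule": "cron(30 7 ? * MON-FRI *)",       # 07:30 UTC on weekdays (assumed)
    "ScalableTargetAction": {"MinCapacity": 100, "MaxCapacity": 100},
}
```

Pre-warming execution environments on a schedule removes the cold-start latency employees would otherwise see at the start of each day.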


Question # 163

A company offers a food delivery service that is growing rapidly. Because of the growth, the company's order processing system is experiencing scaling problems during peak traffic hours. The current architecture includes the following:
• A group of Amazon EC2 instances that run in an Amazon EC2 Auto Scaling group to collect orders from the application
• Another group of EC2 instances that run in an Amazon EC2 Auto Scaling group to fulfill orders
The order collection process occurs quickly, but the order fulfillment process can take longer. Data must not be lost because of a scaling event. A solutions architect must ensure that the order collection process and the order fulfillment process can both scale properly during peak traffic hours. The solution must optimize utilization of the company's AWS resources. Which solution meets these requirements?

A. Use Amazon CloudWatch metrics to monitor the CPU of each instance in the Auto Scaling groups. Configure each Auto Scaling group's minimum capacity according to peak workload values.
B. Use Amazon CloudWatch metrics to monitor the CPU of each instance in the Auto Scaling groups. Configure a CloudWatch alarm to invoke an Amazon Simple Notification Service (Amazon SNS) topic that creates additional Auto Scaling groups on demand.
C. Provision two Amazon Simple Queue Service (Amazon SQS) queues: one for order collection and another for order fulfillment. Configure the EC2 instances to poll their respective queue. Scale the Auto Scaling groups based on notifications that the queues send.
D. Provision two Amazon Simple Queue Service (Amazon SQS) queues: one for order collection and another for order fulfillment. Configure the EC2 instances to poll their respective queue. Create a metric based on a backlog per instance calculation. Scale the Auto Scaling groups based on this metric.
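The "backlog per instance" calculation in option D is the documented AWS pattern for SQS-based scaling: divide the queue's visible message count by the number of in-service instances, and size the group so each instance carries no more than the backlog it can clear within the latency target. A sketch of the arithmetic (function names are illustrative):

```python
import math

def backlog_per_instance(messages_visible: int, in_service_instances: int) -> float:
    """ApproximateNumberOfMessagesVisible divided by in-service instance count."""
    return messages_visible / max(in_service_instances, 1)

def desired_instances(messages_visible: int, acceptable_backlog: float) -> int:
    """Instances needed so each carries no more than the acceptable backlog.

    acceptable_backlog = messages one instance can process within the latency
    target (per-instance processing rate * target seconds).
    """
    return math.ceil(messages_visible / acceptable_backlog)
```

For example, if one instance processes 10 messages per second and the latency target is 100 seconds, the acceptable backlog per instance is 1,000, so a queue depth of 4,500 calls for 5 instances.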


Question # 164

A company sends AWS CloudTrail logs from multiple AWS accounts to an Amazon S3 bucket in a centralized account. The company must keep the CloudTrail logs. The company must also be able to query the CloudTrail logs at any time. Which solution will meet these requirements?

A. Use the CloudTrail event history in the centralized account to create an Amazon Athena table. Query the CloudTrail logs from Athena.
B. Configure an Amazon Neptune instance to manage the CloudTrail logs. Query the CloudTrail logs from Neptune.
C. Configure CloudTrail to send the logs to an Amazon DynamoDB table. Create a dashboard in Amazon QuickSight to query the logs in the table.
D. Use Amazon Athena to create an Athena notebook. Configure CloudTrail to send the logs to the notebook. Run queries from Athena.


Question # 165

A company stores its data on premises. The amount of data is growing beyond the company's available capacity. The company wants to migrate its data from the on-premises location to an Amazon S3 bucket. The company needs a solution that will automatically validate the integrity of the data after the transfer. Which solution will meet these requirements?

A. Order an AWS Snowball Edge device. Configure the Snowball Edge device to perform the online data transfer to an S3 bucket.
B. Deploy an AWS DataSync agent on premises. Configure the DataSync agent to perform the online data transfer to an S3 bucket.
C. Create an Amazon S3 File Gateway on premises. Configure the S3 File Gateway to perform the online data transfer to an S3 bucket.
D. Configure an accelerator in Amazon S3 Transfer Acceleration on premises. Configure the accelerator to perform the online data transfer to an S3 bucket.


Question # 166

A company is building a RESTful serverless web application on AWS by using Amazon API Gateway and AWS Lambda. The users of this web application will be geographically distributed, and the company wants to reduce the latency of API requests to these users. Which type of endpoint should a solutions architect use to meet these requirements?

A. Private endpoint
B. Regional endpoint
C. Interface VPC endpoint
D. Edge-optimized endpoint


Question # 167

A company runs multiple Amazon EC2 Linux instances in a VPC across two Availability Zones. The instances host applications that use a hierarchical directory structure. The applications need to read and write rapidly and concurrently to shared storage. What should a solutions architect do to meet these requirements?

A. Create an Amazon S3 bucket. Allow access from all the EC2 instances in the VPC.
B. Create an Amazon Elastic File System (Amazon EFS) file system. Mount the EFS file system from each EC2 instance.
C. Create a file system on a Provisioned IOPS SSD (io2) Amazon Elastic Block Store (Amazon EBS) volume. Attach the EBS volume to all the EC2 instances.
D. Create file systems on Amazon Elastic Block Store (Amazon EBS) volumes that are attached to each EC2 instance. Synchronize the EBS volumes across the different EC2 instances.


Question # 168

A solutions architect is using an AWS CloudFormation template to deploy a three-tier web application. The web application consists of a web tier and an application tier that stores and retrieves user data in Amazon DynamoDB tables. The web and application tiers are hosted on Amazon EC2 instances, and the database tier is not publicly accessible. The application EC2 instances need to access the DynamoDB tables without exposing API credentials in the template. What should the solutions architect do to meet these requirements?

A. Create an IAM role to read the DynamoDB tables. Associate the role with the application instances by referencing an instance profile.
B. Create an IAM role that has the required permissions to read and write from the DynamoDB tables. Add the role to the EC2 instance profile, and associate the instance profile with the application instances.
C. Use the parameter section in the AWS CloudFormation template to have the user input access and secret keys from an already-created IAM user that has the required permissions to read and write from the DynamoDB tables.
D. Create an IAM user in the AWS CloudFormation template that has the required permissions to read and write from the DynamoDB tables. Use the GetAtt function to retrieve the access and secret keys, and pass them to the application instances through the user data.


Question # 169

A solutions architect is designing the storage architecture for a new web application used for storing and viewing engineering drawings. All application components will be deployed on the AWS infrastructure. The application design must support caching to minimize the amount of time that users wait for the engineering drawings to load. The application must be able to store petabytes of data. Which combination of storage and caching should the solutions architect use?

A. Amazon S3 with Amazon CloudFront
B. Amazon S3 Glacier with Amazon ElastiCache
C. Amazon Elastic Block Store (Amazon EBS) volumes with Amazon CloudFront
D. AWS Storage Gateway with Amazon ElastiCache


Question # 170

A company's website handles millions of requests each day, and the number of requests continues to increase. A solutions architect needs to improve the response time of the web application. The solutions architect determines that the application needs to decrease latency when retrieving product details from the Amazon DynamoDB table. Which solution will meet these requirements with the LEAST amount of operational overhead?

A. Set up a DynamoDB Accelerator (DAX) cluster. Route all read requests through DAX.
B. Set up Amazon ElastiCache for Redis between the DynamoDB table and the web application. Route all read requests through Redis.
C. Set up Amazon ElastiCache for Memcached between the DynamoDB table and the web application. Route all read requests through Memcached.
D. Set up Amazon DynamoDB Streams on the table, and have AWS Lambda read from the table and populate Amazon ElastiCache. Route all read requests through ElastiCache.


Question # 171

A social media company is building a feature for its website. The feature will give users the ability to upload photos. The company expects significant increases in demand during large events and must ensure that the website can handle the upload traffic from users. Which solution meets these requirements with the MOST scalability?

A. Upload files from the user's browser to the application servers. Transfer the files to an Amazon S3 bucket.
B. Provision an AWS Storage Gateway file gateway. Upload files directly from the user's browser to the file gateway.
C. Generate Amazon S3 presigned URLs in the application. Upload files directly from the user's browser into an S3 bucket.
D. Provision an Amazon Elastic File System (Amazon EFS) file system. Upload files directly from the user's browser to the file system.
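Option C scales best because a presigned URL embeds an expiry and a signature, so S3 can accept the upload directly without the traffic passing through application servers. A toy stdlib sketch of the idea behind presigning (a simple HMAC scheme, not AWS Signature Version 4, and all names here are hypothetical):

```python
import hashlib
import hmac
import time

SECRET = b"demo-signing-key"  # hypothetical; real presigning uses AWS credentials

def presign_upload(object_key, expires_in=300, now=None):
    """Return the query parameters a client would attach to its PUT request."""
    expires = int(now if now is not None else time.time()) + expires_in
    message = f"PUT:{object_key}:{expires}".encode()
    signature = hmac.new(SECRET, message, hashlib.sha256).hexdigest()
    return {"key": object_key, "Expires": expires, "Signature": signature}

def verify_upload(params, now=None):
    """Server-side check: the signature matches and the URL has not expired."""
    if int(now if now is not None else time.time()) > params["Expires"]:
        return False
    message = f"PUT:{params['key']}:{params['Expires']}".encode()
    expected = hmac.new(SECRET, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, params["Signature"])
```

Because the signature covers the object key and expiry, a tampered or expired URL fails verification, while a valid one lets the storage service accept the upload with no application server in the data path.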


Question # 172

A company hosts an application on Amazon EC2 instances that run in a single Availability Zone. The application is accessible by using the transport layer of the Open Systems Interconnection (OSI) model. The company needs the application architecture to have high availability. Which combination of steps will meet these requirements MOST cost-effectively? (Select TWO.)

A. Configure new EC2 instances in a different Availability Zone. Use Amazon Route 53 to route traffic to all instances.
B. Configure a Network Load Balancer in front of the EC2 instances.
C. Configure a Network Load Balancer for TCP traffic to the instances. Configure an Application Load Balancer for HTTP and HTTPS traffic to the instances.
D. Create an Auto Scaling group for the EC2 instances. Configure the Auto Scaling group to use multiple Availability Zones. Configure the Auto Scaling group to run application health checks on the instances.
E. Create an Amazon CloudWatch alarm. Configure the alarm to restart EC2 instances that transition to a stopped state.


Question # 173

A company has a service that reads and writes large amounts of data from an Amazon S3 bucket in the same AWS Region. The service is deployed on Amazon EC2 instances within the private subnet of a VPC. The service communicates with Amazon S3 over a NAT gateway in the public subnet. However, the company wants a solution that will reduce the data output costs. Which solution will meet these requirements MOST cost-effectively?

A. Provision a dedicated EC2 NAT instance in the public subnet. Configure the route table for the private subnet to use the elastic network interface of this instance as the destination for all S3 traffic.
B. Provision a dedicated EC2 NAT instance in the private subnet. Configure the route table for the public subnet to use the elastic network interface of this instance as the destination for all S3 traffic.
C. Provision a VPC gateway endpoint. Configure the route table for the private subnet to use the gateway endpoint as the route for all S3 traffic.
D. Provision a second NAT gateway. Configure the route table for the private subnet to use this NAT gateway as the destination for all S3 traffic.


Question # 174

A company is planning to use an Amazon DynamoDB table for data storage. The company is concerned about cost optimization. The table will not be used on most mornings. In the evenings, the read and write traffic will often be unpredictable. When traffic spikes occur, they will happen very quickly. What should a solutions architect recommend?

A. Create a DynamoDB table in on-demand capacity mode.
B. Create a DynamoDB table with a global secondary index.
C. Create a DynamoDB table with provisioned capacity and auto scaling.
D. Create a DynamoDB table in provisioned capacity mode, and configure it as a global table.


Question # 175

A manufacturing company has machine sensors that upload .csv files to an Amazon S3 bucket. These .csv files must be converted into images and must be made available as soon as possible for the automatic generation of graphical reports. The images become irrelevant after 1 month, but the .csv files must be kept to train machine learning (ML) models twice a year. The ML trainings and audits are planned weeks in advance. Which combination of steps will meet these requirements MOST cost-effectively? (Select TWO.)

A. Launch an Amazon EC2 Spot Instance that downloads the .csv files every hour, generates the image files, and uploads the images to the S3 bucket.
B. Design an AWS Lambda function that converts the .csv files into images and stores the images in the S3 bucket. Invoke the Lambda function when a .csv file is uploaded. 
C. Create S3 Lifecycle rules for .csv files and image files in the S3 bucket. Transition the .csv files from S3 Standard to S3 Glacier 1 day after they are uploaded. Expire the image files after 30 days. 
D. Create S3 Lifecycle rules for .csv files and image files in the S3 bucket. Transition the .csv files from S3 Standard to S3 One Zone-Infrequent Access (S3 One Zone-IA) 1 day after they are uploaded. Expire the image files after 30 days. 
E. Create S3 Lifecycle rules for .csv files and image files in the S3 bucket. Transition the .csv files from S3 Standard to S3 Standard-Infrequent Access (S3 Standard-IA) 1 day after they are uploaded. Keep the image files in Reduced Redundancy Storage (RRS). 
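The two lifecycle rules described in option C can live in a single bucket lifecycle configuration: one transition rule for the .csv files and one expiration rule for the images. A sketch of that configuration as a plain dict (the "csv/" and "images/" prefixes are assumptions about how the bucket is laid out):

```python
# Lifecycle configuration matching option C; the key prefixes are
# assumptions about the bucket layout, not values from the question.
lifecycle_configuration = {
    "Rules": [
        {
            "ID": "archive-csv",
            "Filter": {"Prefix": "csv/"},
            "Status": "Enabled",
            # Move sensor .csv files to Glacier 1 day after upload.
            "Transitions": [{"Days": 1, "StorageClass": "GLACIER"}],
        },
        {
            "ID": "expire-images",
            "Filter": {"Prefix": "images/"},
            "Status": "Enabled",
            # Delete generated images once they become irrelevant.
            "Expiration": {"Days": 30},
        },
    ]
}
```

Glacier works here because the ML trainings are planned weeks in advance, so the slower (and cheaper) retrieval is acceptable.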


Question # 176

A company wants to move from many standalone AWS accounts to a consolidated, multi-account architecture. The company plans to create many new AWS accounts for different business units. The company needs to authenticate access to these AWS accounts by using a centralized corporate directory service. Which combination of actions should a solutions architect recommend to meet these requirements? (Select TWO.)

A. Create a new organization in AWS Organizations with all features turned on. Create the new AWS accounts in the organization. 
B. Set up an Amazon Cognito identity pool. Configure AWS IAM Identity Center (AWS Single Sign-On) to accept Amazon Cognito authentication.
C. Configure a service control policy (SCP) to manage the AWS accounts. Add AWS IAM Identity Center (AWS Single Sign-On) to AWS Directory Service.
D. Create a new organization in AWS Organizations. Configure the organization's authentication mechanism to use AWS Directory Service directly.
E. Set up AWS IAM Identity Center (AWS Single Sign-On) in the organization. Configure IAM Identity Center, and integrate it with the company's corporate directory service.


Question # 177

A company is deploying a new application on Amazon EC2 instances. The application writes data to Amazon Elastic Block Store (Amazon EBS) volumes. The company needs to ensure that all data that is written to the EBS volumes is encrypted at rest. Which solution will meet this requirement?

A. Create an IAM role that specifies EBS encryption. Attach the role to the EC2 instances.
B. Create the EBS volumes as encrypted volumes. Attach the EBS volumes to the EC2 instances.
C. Create an EC2 instance tag that has a key of Encrypt and a value of True. Tag all instances that require encryption at the EBS level.
D. Create an AWS Key Management Service (AWS KMS) key policy that enforces EBS encryption in the account. Ensure that the key policy is active.


Question # 178

A serverless application uses Amazon API Gateway, AWS Lambda, and Amazon DynamoDB. The Lambda function needs permissions to read and write to the DynamoDB table. Which solution will give the Lambda function access to the DynamoDB table MOST securely?

A. Create an IAM user with programmatic access to the Lambda function. Attach a policy to the user that allows read and write access to the DynamoDB table. Store the access_key_id and secret_access_key parameters as part of the Lambda environment variables. Ensure that other AWS users do not have read and write access to the Lambda function configuration.
B. Create an IAM role that includes Lambda as a trusted service. Attach a policy to the role that allows read and write access to the DynamoDB table. Update the configuration of the Lambda function to use the new role as the execution role.
C. Create an IAM user with programmatic access to the Lambda function. Attach a policy to the user that allows read and write access to the DynamoDB table. Store the access_key_id and secret_access_key parameters in AWS Systems Manager Parameter Store as secure string parameters. Update the Lambda function code to retrieve the secure string parameters before connecting to the DynamoDB table.
D. Create an IAM role that includes DynamoDB as a trusted service. Attach a policy to the role that allows read and write access from the Lambda function. Update the code of the Lambda function to attach to the new role as an execution role.
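Option B hinges on two documents: a trust policy that names the Lambda service as the principal allowed to assume the role, and a permissions policy scoped to the one table. Sketched as Python dicts (the Region, account ID, and table ARN are placeholders):

```python
# Trust policy: lets the Lambda service assume the execution role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "lambda.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }
    ],
}

# Permissions policy: read/write on a single table (placeholder ARN).
table_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "dynamodb:GetItem",
                "dynamodb:PutItem",
                "dynamodb:UpdateItem",
                "dynamodb:DeleteItem",
                "dynamodb:Query",
            ],
            "Resource": "arn:aws:dynamodb:us-east-1:111122223333:table/ExampleTable",
        }
    ],
}
```

No long-lived access keys appear anywhere: the Lambda runtime receives temporary credentials for the execution role automatically, which is what makes option B more secure than the access-key-based options.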


Question # 179

A company wants to use artificial intelligence (AI) to determine the quality of its customer service calls. The company currently manages calls in four different languages, including English. The company will offer new languages in the future. The company does not have the resources to regularly maintain machine learning (ML) models. The company needs to create written sentiment analysis reports from the customer service call recordings. The customer service call recording text must be translated into English. Which combination of steps will meet these requirements? (Select THREE.)

A. Use Amazon Comprehend to translate the audio recordings into English.
B. Use Amazon Lex to create the written sentiment analysis reports.
C. Use Amazon Polly to convert the audio recordings into text.
D. Use Amazon Transcribe to convert the audio recordings in any language into text.
E. Use Amazon Translate to translate text in any language to English.
F. Use Amazon Comprehend to create the sentiment analysis reports.


Question # 180

A company needs to configure a real-time data ingestion architecture for its application. The company needs an API, a process that transforms data as the data is streamed, and a storage solution for the data. Which solution will meet these requirements with the LEAST operational overhead?

A. Deploy an Amazon EC2 instance to host an API that sends data to an Amazon Kinesis data stream. Create an Amazon Kinesis Data Firehose delivery stream that uses the Kinesis data stream as a data source. Use AWS Lambda functions to transform the data. Use the Kinesis Data Firehose delivery stream to send the data to Amazon S3. 
B. Deploy an Amazon EC2 instance to host an API that sends data to AWS Glue. Stop source/destination checking on the EC2 instance. Use AWS Glue to transform the data and to send the data to Amazon S3. 
C. Configure an Amazon API Gateway API to send data to an Amazon Kinesis data stream. Create an Amazon Kinesis Data Firehose delivery stream that uses the Kinesis data stream as a data source. Use AWS Lambda functions to transform the data. Use the Kinesis Data Firehose delivery stream to send the data to Amazon S3. 
D. Configure an Amazon API Gateway API to send data to AWS Glue. Use AWS Lambda functions to transform the data. Use AWS Glue to send the data to Amazon S3.


Question # 181

A 4-year-old media company is using the AWS Organizations all features feature set to organize its AWS accounts. According to the company's finance team, the billing information on the member accounts must not be accessible to anyone, including the root user of the member accounts. Which solution will meet these requirements?

A. Add all finance team users to an IAM group. Attach an AWS managed policy named Billing to the group. 
B. Attach an identity-based policy to deny access to the billing information to all users, including the root user. 
C. Create a service control policy (SCP) to deny access to the billing information. Attach the SCP to the root organizational unit (OU). 
D. Convert from the Organizations all features feature set to the Organizations consolidated billing feature set. 


Question # 182

A company uses on-premises servers to host its applications. The company is running out of storage capacity. The applications use both block storage and NFS storage. The company needs a high-performing solution that supports local caching without rearchitecting its existing applications. Which combination of actions should a solutions architect take to meet these requirements? (Select TWO.)

A. Mount Amazon S3 as a file system to the on-premises servers. 
B. Deploy an AWS Storage Gateway file gateway to replace NFS storage.
C. Deploy AWS Snowball Edge to provision NFS mounts to on-premises servers.
D. Deploy an AWS Storage Gateway volume gateway to replace the block storage.
E. Deploy Amazon Elastic File System (Amazon EFS) volumes and mount them to on-premises servers.


Question # 183

A company operates a two-tier application for image processing. The application uses two Availability Zones, each with one public subnet and one private subnet. An Application Load Balancer (ALB) for the web tier uses the public subnets. Amazon EC2 instances for the application tier use the private subnets. Users report that the application is running more slowly than expected. A security audit of the web server log files shows that the application is receiving millions of illegitimate requests from a small number of IP addresses. A solutions architect needs to resolve the immediate performance problem while the company investigates a more permanent solution. What should the solutions architect recommend to meet this requirement?

A. Modify the inbound security group for the web tier. Add a deny rule for the IP addresses that are consuming resources.
B. Modify the network ACL for the web tier subnets. Add an inbound deny rule for the IP addresses that are consuming resources.
C. Modify the inbound security group for the application tier. Add a deny rule for the IP addresses that are consuming resources.
D. Modify the network ACL for the application tier subnets. Add an inbound deny rule for the IP addresses that are consuming resources.


Question # 184

A gaming company uses Amazon DynamoDB to store user information such as geographic location, player data, and leaderboards. The company needs to configure continuous backups to an Amazon S3 bucket with a minimal amount of coding. The backups must not affect availability of the application and must not affect the read capacity units (RCUs) that are defined for the table. Which solution meets these requirements?

A. Use an Amazon EMR cluster. Create an Apache Hive job to back up the data to Amazon S3. 
B. Export the data directly from DynamoDB to Amazon S3 with continuous backups. Turn on point-in-time recovery for the table. 
C. Configure Amazon DynamoDB Streams. Create an AWS Lambda function to consume the stream and export the data to an Amazon S3 bucket. 
D. Create an AWS Lambda function to export the data from the database tables to Amazon S3 on a regular basis. Turn on point-in-time recovery for the table. 


Question # 185

A company has an on-premises server that uses an Oracle database to process and store customer information. The company wants to use an AWS database service to achieve higher availability and to improve application performance. The company also wants to offload reporting from its primary database system. Which solution will meet these requirements in the MOST operationally efficient way?

A. Use AWS Database Migration Service (AWS DMS) to create an Amazon RDS DB instance in multiple AWS Regions. Point the reporting functions toward a separate DB instance from the primary DB instance.
B. Use Amazon RDS in a Single-AZ deployment to create an Oracle database. Create a read replica in the same zone as the primary DB instance. Direct the reporting functions to the read replica.
C. Use Amazon RDS deployed in a Multi-AZ cluster deployment to create an Oracle database. Direct the reporting functions to use the reader instance in the cluster deployment.
D. Use Amazon RDS deployed in a Multi-AZ instance deployment to create an Amazon Aurora database. Direct the reporting functions to the reader instances.


Question # 186

A company has a small Python application that processes JSON documents and outputs the results to an on-premises SQL database. The application runs thousands of times each day. The company wants to move the application to the AWS Cloud. The company needs a highly available solution that maximizes scalability and minimizes operational overhead. Which solution will meet these requirements? 

A. Place the JSON documents in an Amazon S3 bucket. Run the Python code on multiple Amazon EC2 instances to process the documents. Store the results in an Amazon Aurora DB cluster.
B. Place the JSON documents in an Amazon S3 bucket. Create an AWS Lambda function that runs the Python code to process the documents as they arrive in the S3 bucket. Store the results in an Amazon Aurora DB cluster. 
C. Place the JSON documents in an Amazon Elastic Block Store (Amazon EBS) volume. Use the EBS Multi-Attach feature to attach the volume to multiple Amazon EC2 instances. Run the Python code on the EC2 instances to process the documents. Store the results on an Amazon RDS DB instance. 
D. Place the JSON documents in an Amazon Simple Queue Service (Amazon SQS) queue as messages. Deploy the Python code as a container on an Amazon Elastic Container Service (Amazon ECS) cluster that is configured with the Amazon EC2 launch type. Use the container to process the SQS messages. Store the results on an Amazon RDS DB instance.


Question # 187

A company wants to build a web application on AWS. Client access requests to the website are not predictable and can be idle for a long time. Only customers who have paid a subscription fee can have the ability to sign in and use the web application. Which combination of steps will meet these requirements MOST cost-effectively? (Select THREE.)

A. Create an AWS Lambda function to retrieve user information from Amazon DynamoDB. Create an Amazon API Gateway endpoint to accept RESTful APIs. Send the API calls to the Lambda function. 
B. Create an Amazon Elastic Container Service (Amazon ECS) service behind an Application Load Balancer to retrieve user information from Amazon RDS. Create an Amazon API Gateway endpoint to accept RESTful APIs. Send the API calls to the Lambda function. 
C. Create an Amazon Cognito user pool to authenticate users.
D. Create an Amazon Cognito identity pool to authenticate users.
E. Use AWS Amplify to serve the frontend web content with HTML, CSS, and JS. Use an integrated Amazon CloudFront configuration.
F. Use Amazon S3 static web hosting with PHP, CSS, and JS. Use Amazon CloudFront to serve the frontend web content.


Question # 188

A company runs a three-tier web application in the AWS Cloud that operates across three Availability Zones. The application architecture has an Application Load Balancer, an Amazon EC2 web server that hosts user session states, and a MySQL database that runs on an EC2 instance. The company expects sudden increases in application traffic. The company wants to be able to scale to meet future application capacity demands and to ensure high availability across all three Availability Zones. Which solution will meet these requirements?

A. Migrate the MySQL database to Amazon RDS for MySQL with a Multi-AZ DB cluster deployment. Use Amazon ElastiCache for Redis with high availability to store session data and to cache reads. Migrate the web server to an Auto Scaling group that is in three Availability Zones. 
B. Migrate the MySQL database to Amazon RDS for MySQL with a Multi-AZ DB cluster deployment. Use Amazon ElastiCache for Memcached with high availability to store session data and to cache reads. Migrate the web server to an Auto Scaling group that is in three Availability Zones. 
C. Migrate the MySQL database to Amazon DynamoDB. Use DynamoDB Accelerator (DAX) to cache reads. Store the session data in DynamoDB. Migrate the web server to an Auto Scaling group that is in three Availability Zones. 
D. Migrate the MySQL database to Amazon RDS for MySQL in a single Availability Zone. Use Amazon ElastiCache for Redis with high availability to store session data and to cache reads. Migrate the web server to an Auto Scaling group that is in three Availability Zones. 


Question # 189

A company runs a website that uses a content management system (CMS) on Amazon EC2. The CMS runs on a single EC2 instance and uses an Amazon Aurora MySQL Multi-AZ DB instance for the data tier. Website images are stored on an Amazon Elastic Block Store (Amazon EBS) volume that is mounted inside the EC2 instance. Which combination of actions should a solutions architect take to improve the performance and resilience of the website? (Select TWO.)

A. Move the website images into an Amazon S3 bucket that is mounted on every EC2 instance. 
B. Share the website images by using an NFS share from the primary EC2 instance. Mount this share on the other EC2 instances. 
C. Move the website images onto an Amazon Elastic File System (Amazon EFS) file system that is mounted on every EC2 instance. 
D. Create an Amazon Machine Image (AMI) from the existing EC2 instance. Use the AMI to provision new instances behind an Application Load Balancer as part of an Auto Scaling group. Configure the Auto Scaling group to maintain a minimum of two instances. Configure an accelerator in AWS Global Accelerator for the website.
E. Create an Amazon Machine Image (AMI) from the existing EC2 instance. Use the AMI to provision new instances behind an Application Load Balancer as part of an Auto Scaling group. Configure the Auto Scaling group to maintain a minimum of two instances. Configure an Amazon CloudFront distribution for the website.


Question # 190

A company moved its on-premises PostgreSQL database to an Amazon RDS for PostgreSQL DB instance. The company successfully launched a new product. The workload on the database has increased. The company wants to accommodate the larger workload without adding infrastructure. Which solution will meet these requirements MOST cost-effectively?

A. Buy reserved DB instances for the total workload. Make the Amazon RDS for PostgreSQL DB instance larger. 
B. Make the Amazon RDS for PostgreSQL DB instance a Multi-AZ DB instance. 
C. Buy reserved DB instances for the total workload. Add another Amazon RDS for PostgreSQL DB instance. 
D. Make the Amazon RDS for PostgreSQL DB instance an on-demand DB instance. 


Question # 191

A company's data platform uses an Amazon Aurora MySQL database. The database has multiple read replicas and multiple DB instances across different Availability Zones. Users have recently reported errors from the database that indicate that there are too many connections. The company wants to reduce the failover time by 20% when a read replica is promoted to primary writer. Which solution will meet this requirement?  

A. Switch from Aurora to Amazon RDS with Multi-AZ cluster deployment. 
B. Use Amazon RDS Proxy in front of the Aurora database. 
C. Switch to Amazon DynamoDB with DynamoDB Accelerator (DAX) for read connections. 
D. Switch to Amazon Redshift with relocation capability. 


Question # 192

A company uses Amazon EC2 instances to host its internal systems. As part of a deployment operation, an administrator tries to use the AWS CLI to terminate an EC2 instance. However, the administrator receives a 403 (Access Denied) error message. The administrator is using an IAM role that has the following IAM policy attached: What is the cause of the unsuccessful request?

A. The EC2 instance has a resource-based policy with a Deny statement. 
B. The principal has not been specified in the policy statement
C. The "Action" field does not grant the actions that are required to terminate the EC2 instance. 
D. The request to terminate the EC2 instance does not originate from the CIDR blocks 192.0.2.0/24 or 203.0.113.0/24. 
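The IAM policy referenced in this question is not reproduced in this copy, but option D implies it contains an aws:SourceIp condition restricting requests to the 192.0.2.0/24 and 203.0.113.0/24 blocks. The sketch below (a local illustration, not AWS's actual policy evaluator) shows the membership test such a condition performs:

```python
import ipaddress

# CIDR blocks taken from option D; the policy itself is assumed, not shown.
ALLOWED_CIDRS = ["192.0.2.0/24", "203.0.113.0/24"]

def source_ip_allowed(ip: str) -> bool:
    """Return True if the caller's IP falls inside one of the allowed blocks."""
    addr = ipaddress.ip_address(ip)
    return any(addr in ipaddress.ip_network(cidr) for cidr in ALLOWED_CIDRS)

print(source_ip_allowed("192.0.2.45"))    # inside the first block -> True
print(source_ip_allowed("198.51.100.7"))  # outside both blocks -> False
```

A request from any address outside these ranges fails the condition, so the Allow statement never applies and the CLI receives an Access Denied error.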


Question # 193

A company uses Amazon API Gateway to run a private gateway with two REST APIs in the same VPC. The BuyStock RESTful web service calls the CheckFunds RESTful web service to ensure that enough funds are available before a stock can be purchased. The company has noticed in the VPC flow logs that the BuyStock RESTful web service calls the CheckFunds RESTful web service over the internet instead of through the VPC. A solutions architect must implement a solution so that the APIs communicate through the VPC. Which solution will meet these requirements with the FEWEST changes to the code?

A. Add an X-API-Key header in the HTTP header for authorization. 
B. Use an interface endpoint. 
C. Use a gateway endpoint. 
D. Add an Amazon Simple Queue Service (Amazon SQS) queue between the two REST APIs. 


Question # 194

A company has multiple Windows file servers on premises. The company wants to migrate and consolidate its files into an Amazon FSx for Windows File Server file system. File permissions must be preserved to ensure that access rights do not change. Which solutions will meet these requirements? (Select TWO.) 

A. Deploy AWS DataSync agents on premises. Schedule DataSync tasks to transfer the data to the FSx for Windows File Server file system. 
B. Copy the shares on each file server into Amazon S3 buckets by using the AWS CLI. Schedule AWS DataSync tasks to transfer the data to the FSx for Windows File Server file system. 
C. Remove the drives from each file server. Ship the drives to AWS for import into Amazon S3. Schedule AWS DataSync tasks to transfer the data to the FSx for Windows File Server file system. 
D. Order an AWS Snowcone device. Connect the device to the on-premises network. Launch AWS DataSync agents on the device. Schedule DataSync tasks to transfer the data to the FSx for Windows File Server file system. 
E. Order an AWS Snowball Edge Storage Optimized device. Connect the device to the on-premises network. Copy data to the device by using the AWS CLI. Ship the device back to AWS for import into Amazon S3. Schedule AWS DataSync tasks to transfer the data to the FSx for Windows File Server file system. 


Question # 195

A company is running a microservices application on Amazon EC2 instances. The company wants to migrate the application to an Amazon Elastic Kubernetes Service (Amazon EKS) cluster for scalability. The company must configure the Amazon EKS control plane with endpoint private access set to true and endpoint public access set to false to maintain security compliance. The company must also put the data plane in private subnets. However, the company has received error notifications because the node cannot join the cluster. Which solution will allow the node to join the cluster?

A. Grant the required permission in AWS Identity and Access Management (IAM) to the AmazonEKSNodeRole IAM role. 
B. Create interface VPC endpoints to allow nodes to access the control plane. 
C. Recreate nodes in the public subnet. Restrict security groups for EC2 nodes. 
D. Allow outbound traffic in the security group of the nodes. 


Question # 196

A company wants to create an application to store employee data in a hierarchical structured relationship. The company needs a minimum-latency response to high-traffic queries for the employee data and must protect any sensitive data. The company also needs to receive monthly email messages if any financial information is present in the employee data. Which combination of steps should a solutions architect take to meet these requirements? (Select TWO.)

A. Use Amazon Redshift to store the employee data in hierarchies. Unload the data to Amazon S3 every month. 
B. Use Amazon DynamoDB to store the employee data in hierarchies. Export the data to Amazon S3 every month. 
C. Configure Amazon Macie for the AWS account. Integrate Macie with Amazon EventBridge to send monthly events to AWS Lambda. 
D. Use Amazon Athena to analyze the employee data in Amazon S3. Integrate Athena with Amazon QuickSight to publish analysis dashboards and share the dashboards with users. 
E. Configure Amazon Macie for the AWS account. Integrate Macie with Amazon EventBridge to send monthly notifications through an Amazon Simple Notification Service (Amazon SNS) subscription. 


Question # 197

A company wants to use high-performance computing and artificial intelligence to improve its fraud prevention and detection technology. The company requires distributed processing to complete a single workload as quickly as possible. Which solution will meet these requirements?

A. Use Amazon Elastic Kubernetes Service (Amazon EKS) and multiple containers. 
B. Use AWS ParallelCluster and the Message Passing Interface (MPI) libraries. 
C. Use an Application Load Balancer and Amazon EC2 instances. 
D. Use AWS Lambda functions. 


Question # 198

A company runs container applications by using Amazon Elastic Kubernetes Service (Amazon EKS) and the Kubernetes Horizontal Pod Autoscaler. The workload is not consistent throughout the day. A solutions architect notices that the number of nodes does not automatically scale out when the existing nodes have reached maximum capacity in the cluster, which causes performance issues. Which solution will resolve this issue with the LEAST administrative overhead?

A. Scale out the nodes by tracking the memory usage. 
B. Use the Kubernetes Cluster Autoscaler to manage the number of nodes in the cluster. 
C. Use an AWS Lambda function to resize the EKS cluster automatically. 
D. Use an Amazon EC2 Auto Scaling group to distribute the workload.


Question # 199

A global marketing company has applications that run in the ap-southeast-2 Region and the eu-west-1 Region. Applications that run in a VPC in eu-west-1 need to communicate securely with databases that run in a VPC in ap-southeast-2. Which network design will meet these requirements?

A. Create a VPC peering connection between the eu-west-1 VPC and the ap-southeast-2 VPC. Create an inbound rule in the eu-west-1 application security group that allows traffic from the database server IP addresses in the ap-southeast-2 security group. 
B. Configure a VPC peering connection between the ap-southeast-2 VPC and the eu-west-1 VPC. Update the subnet route tables. Create an inbound rule in the ap-southeast-2 database security group that references the security group ID of the application servers in eu-west-1. 
C. Configure a VPC peering connection between the ap-southeast-2 VPC and the eu-west-1 VPC. Update the subnet route tables. Create an inbound rule in the ap-southeast-2 database security group that allows traffic from the eu-west-1 application server IP addresses. 
D. Create a transit gateway with a peering attachment between the eu-west-1 VPC and the ap-southeast-2 VPC. After the transit gateways are properly peered and routing is configured, create an inbound rule in the database security group that references the security group ID of the application servers in eu-west-1. 


Question # 200

A company migrated a MySQL database from the company's on-premises data center to an Amazon RDS for MySQL DB instance. The company sized the RDS DB instance to meet the company's average daily workload. Once a month, the database performs slowly when the company runs queries for a report. The company wants to have the ability to run reports and maintain the performance of the daily workloads. Which solution will meet these requirements?

A. Create a read replica of the database. Direct the queries to the read replica. 
B. Create a backup of the database. Restore the backup to another DB instance. Direct the queries to the new database. 
C. Export the data to Amazon S3. Use Amazon Athena to query the S3 bucket. 
D. Resize the DB instance to accommodate the additional workload. 


Question # 201

A company has migrated multiple Microsoft Windows Server workloads to Amazon EC2 instances that run in the us-west-1 Region. The company manually backs up the workloads to create an image as needed. In the event of a natural disaster in the us-west-1 Region, the company wants to recover workloads quickly in the us-west-2 Region. The company wants no more than 24 hours of data loss on the EC2 instances. The company also wants to automate any backups of the EC2 instances. Which solutions will meet these requirements with the LEAST administrative effort? (Select TWO.)

A. Create an Amazon EC2-backed Amazon Machine Image (AMI) lifecycle policy to create a backup based on tags. Schedule the backup to run twice daily. Copy the image on demand. 
B. Create an Amazon EC2-backed Amazon Machine Image (AMI) lifecycle policy to create a backup based on tags. Schedule the backup to run twice daily. Configure the copy to the us-west-2 Region. 
C. Create backup vaults in us-west-1 and in us-west-2 by using AWS Backup. Create a backup plan for the EC2 instances based on tag values. Create an AWS Lambda function to run as a scheduled job to copy the backup data to us-west-2. 
D. Create a backup vault by using AWS Backup. Use AWS Backup to create a backup plan for the EC2 instances based on tag values. Define the destination for the copy as us-west-2. Specify the backup schedule to run twice daily.
E. Create a backup vault by using AWS Backup. Use AWS Backup to create a backup plan for the EC2 instances based on tag values. Specify the backup schedule to run twice daily. Copy on demand to us-west-2.
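The 24-hour data-loss requirement in this question is a recovery point objective (RPO) constraint, and the schedule arithmetic behind it is simple: with backups every N hours, the newest backup can be at most N hours old when disaster strikes. A small sketch of that reasoning:

```python
# Worst-case data loss (RPO) for a fixed backup schedule: the most
# recent backup can be up to one full interval old.
def worst_case_data_loss_hours(backups_per_day: int) -> float:
    return 24 / backups_per_day

rpo = worst_case_data_loss_hours(2)  # the twice-daily schedule in the options
print(rpo)  # 12.0 hours, comfortably within the 24-hour target
```

A twice-daily schedule therefore gives a 12-hour worst case, meeting the requirement; even a once-daily schedule would sit exactly at the 24-hour limit.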


Question # 202

A company runs a container application by using Amazon Elastic Kubernetes Service (Amazon EKS). The application includes microservices that manage customers and place orders. The company needs to route incoming requests to the appropriate microservices. Which solution will meet this requirement MOST cost-effectively?

A. Use the AWS Load Balancer Controller to provision a Network Load Balancer. 
B. Use the AWS Load Balancer Controller to provision an Application Load Balancer. 
C. Use an AWS Lambda function to connect the requests to Amazon EKS. 
D. Use Amazon API Gateway to connect the requests to Amazon EKS. 


Question # 203

An application uses an Amazon RDS MySQL DB instance. The RDS database is becoming low on disk space. A solutions architect wants to increase the disk space without downtime. Which solution meets these requirements with the LEAST amount of effort?

A. Enable storage autoscaling in RDS. 
B. Increase the RDS database instance size. 
C. Change the RDS database instance storage type to Provisioned IOPS. 
D. Back up the RDS database, increase the storage capacity, restore the database, and stop the previous instance. 
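Option A (storage autoscaling) is a single configuration change: you set a maximum storage ceiling and RDS grows the allocated storage automatically, with no downtime. A command sketch (the instance identifier and 1000 GiB ceiling are placeholder values):

```shell
# Enable RDS storage autoscaling by setting --max-allocated-storage (GiB)
# above the current allocation; RDS then scales storage up as needed.
aws rds modify-db-instance \
    --db-instance-identifier my-mysql-db \
    --max-allocated-storage 1000 \
    --apply-immediately
```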


Question # 204

A social media company wants to allow its users to upload images in an application that is hosted in the AWS Cloud. The company needs a solution that automatically resizes the images so that the images can be displayed on multiple device types. The application experiences unpredictable traffic patterns throughout the day. The company is seeking a highly available solution that maximizes scalability. What should a solutions architect do to meet these requirements?

A. Create a static website hosted in Amazon S3 that invokes AWS Lambda functions to resize the images and store the images in an Amazon S3 bucket. 
B. Create a static website hosted in Amazon CloudFront that invokes AWS Step Functions to resize the images and store the images in an Amazon RDS database. 
C. Create a dynamic website hosted on a web server that runs on an Amazon EC2 instance. Configure a process that runs on the EC2 instance to resize the images and store the images in an Amazon S3 bucket. 
D. Create a dynamic website hosted on an automatically scaling Amazon Elastic Container Service (Amazon ECS) cluster that creates a resize job in Amazon Simple Queue Service (Amazon SQS). Set up an image-resizing program that runs on an Amazon EC2 instance to process the resize jobs. 
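Whichever option is chosen, the core of "resize for multiple device types" is computing target dimensions per device while preserving the aspect ratio. The sketch below shows that math only; the device widths are assumptions for illustration, and a real Lambda-based pipeline would pass these sizes to an imaging library:

```python
# Assumed device-width targets; real breakpoints would come from the app.
DEVICE_WIDTHS = {"mobile": 480, "tablet": 1024, "desktop": 1920}

def target_size(orig_w: int, orig_h: int, max_w: int) -> tuple[int, int]:
    """Return (width, height) scaled to fit max_w, keeping the aspect ratio."""
    if orig_w <= max_w:
        return orig_w, orig_h          # never upscale a smaller image
    scale = max_w / orig_w
    return max_w, round(orig_h * scale)

# A 3000x2000 upload scaled for each device class.
sizes = {device: target_size(3000, 2000, w) for device, w in DEVICE_WIDTHS.items()}
print(sizes)
```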


Question # 205

A retail company uses a regional Amazon API Gateway API for its public REST APIs. The API Gateway endpoint is a custom domain name that points to an Amazon Route 53 alias record. A solutions architect needs to create a solution that has minimal effects on customers and minimal data loss to release the new version of APIs. Which solution will meet these requirements?

A. Create a canary release deployment stage for API Gateway. Deploy the latest API version. Point an appropriate percentage of traffic to the canary stage. After API verification, promote the canary stage to the production stage. 
B. Create a new API Gateway endpoint with a new version of the API in OpenAPI YAML file format. Use the import-to-update operation in merge mode into the API in API Gateway. Deploy the new version of the API to the production stage. 
C. Create a new API Gateway endpoint with a new version of the API in OpenAPI JSON file format. Use the import-to-update operation in overwrite mode into the API in API Gateway. Deploy the new version of the API to the production stage. 
D. Create a new API Gateway endpoint with new versions of the API definitions. Create a custom domain name for the new API Gateway API. Point the Route 53 alias record to the new API Gateway API custom domain name.


Amazon SAA-C03 Frequently Asked Questions


What is the passing score for the SAA-C03 exam?
Answer: The passing score for the SAA-C03 exam is 720 out of 1000.
How many questions are on the SAA-C03 exam?
Answer: The SAA-C03 exam consists of 65 multiple choice and multiple response questions.
What is the time limit for the SAA-C03 exam?
Answer: The time limit for the SAA-C03 exam is 130 minutes.
What are the recommended study materials for the SAA-C03 exam?
Answer: The recommended study materials for the SAA-C03 exam include the AWS Certified Solutions Architect Associate Exam Guide, AWS documentation, white papers, and hands-on experience with AWS services.
Can the SAA-C03 exam be taken online?
Answer: Yes, the SAA-C03 exam is delivered online through the AWS certification platform.
What is the cost of the SAA-C03 exam?
Answer: The cost of the SAA-C03 exam is $150 USD.
What is the format of the SAA-C03 exam?
Answer: The SAA-C03 exam consists of multiple choice and multiple response questions and is delivered in a computer-based format.
How long is the SAA-C03 certification valid for?
Answer: The SAA-C03 certification is valid for three years, after which recertification is required to maintain the certification.
What are the topics covered in the SAA-C03 exam?
Answer: The SAA-C03 exam covers topics such as AWS core services, design and deployment of scalable, highly available, and fault-tolerant systems, implementation of security and compliance solutions, and more.
What are the eligibility criteria for taking the SAA-C03 exam?
Answer: There are no specific eligibility criteria for taking the SAA-C03 exam. However, it is recommended to have at least one year of experience with the AWS platform, as well as an understanding of AWS services, architecture, security, and billing.
What is the average salary of an AWS Certified Solutions Architect - Associate?
Answer: The average salary of an AWS Certified Solutions Architect - Associate varies depending on several factors such as location, industry, and experience. On average, the salary for an AWS Certified Solutions Architect - Associate ranges from $90,000 to $150,000 per year.
What industries commonly use AWS Certified Solutions Architects - Associate?
Answer: AWS Certified Solutions Architects - Associate are in high demand across many industries, including technology, finance, healthcare, e-commerce, and more. These professionals are able to design, deploy, and manage scalable and secure cloud-based systems on the AWS platform.
What are the career paths for an AWS Certified Solutions Architect - Associate?
Answer: The career paths for an AWS Certified Solutions Architect - Associate can vary depending on their interests and goals. Some common career paths include advancing to an AWS Certified Solutions Architect - Professional, pursuing additional AWS certifications, or moving into management or leadership roles within their organization.
What additional certifications or training can an AWS Certified Solutions Architect - Associate pursue to advance their career?
Answer: An AWS Certified Solutions Architect - Associate can pursue additional AWS certifications, such as the AWS Certified Solutions Architect - Professional, AWS Certified DevOps Engineer, or AWS Certified Big Data - Specialty. They can also pursue training in specific AWS services, such as Amazon S3, Amazon EC2, or Amazon RDS.
How does obtaining an AWS Certified Solutions Architect - Associate certification impact one's job prospects and earning potential?
Answer: Obtaining an AWS Certified Solutions Architect - Associate certification can positively impact one's job prospects and earning potential. Employers often view AWS certification as a sign of technical expertise and experience, and certified individuals are typically offered higher salaries and more job opportunities.
What are the job duties and responsibilities of an AWS Certified Solutions Architect - Associate?
Answer: The job duties and responsibilities of an AWS Certified Solutions Architect - Associate include designing, deploying, and managing scalable, secure, and highly available systems on the AWS platform, evaluating and recommending AWS services for specific business needs, and working with stakeholders to ensure the proper operation and performance of AWS-based systems.
How does the demand for AWS Certified Solutions Architects - Associate vary by region and industry?
Answer: The demand for AWS Certified Solutions Architects - Associate varies by region and industry, with higher demand in regions with a strong technology presence and in industries that heavily rely on cloud-based systems.
What are some of the most challenging and rewarding aspects of being an AWS Certified Solutions Architect - Associate?
Answer: The most challenging aspect of being an AWS Certified Solutions Architect - Associate is staying current with the rapidly evolving AWS platform and its new services and features. The most rewarding aspect is the opportunity to work on exciting and innovative projects, and the satisfaction of delivering solutions that drive business success.
How does continuous education and keeping up with the latest advancements in AWS technology impact the success and growth of an AWS Certified Solutions Architect - Associate?
Answer: Continuous education and keeping up with the latest advancements in AWS technology are crucial for the success and growth of an AWS Certified Solutions Architect - Associate. The AWS platform is constantly evolving, and certified professionals must continually update their skills to remain effective and competitive.
Customer Feedback

What our clients say about SAA-C03 Practice Questions

    Emi     Apr 19, 2024
Hi guys, I am pleased to inform you that I passed my SAA-C03 exam on the first try thanks to these great exam dumps!
    Emily Smith     Apr 18, 2024
I found the SAA-C03 exam to be a comprehensive assessment of my AWS knowledge. The real-world scenarios and practical questions helped me to see how my skills could be applied in a real-world setting. Today I passed my AWS Certified Solutions Architect - Associate (SAA-C03) exam thanks to salesforcexamdumps.com. If anyone wants exam information, you can get it from here: https://d1.awsstatic.com/training-and-certification/docs-sa-assoc/AWS-Certified-Solutions-Architect-Associate_Exam-Guide.pdf
    Sophia Kim     Apr 18, 2024
I appreciated the format of the SAA-C03 exam dumps, with a mix of multiple-choice and hands-on questions. It was a great way to test both my technical knowledge and practical skills. I got PDF + Exam Engine package and i never found such material before.
    Youssef Abdelhakim     Apr 17, 2024
These questions are helpful for passing the exam, but study beyond them if you truly want to learn the material.
    Maria Lopez     Apr 17, 2024
I received my SAA-C03 exam results immediately after completing it and was pleasantly surprised with a 92% mark. Truly amazing!
    Rachel Chen     Apr 16, 2024
I thought the SAA-C03 exam was well-structured and gave a good representation of the skills and knowledge necessary to be a successful AWS Solutions Architect. The questions were challenging, but not impossible, which I felt was a good balance.
    Ji-hyun     Apr 16, 2024
Salesforcexamdumps.com Study Material and questions are extremely informative and were a huge help to me. I got 90% marks.
    Michael Brown     Apr 15, 2024
The SAA-C03 exam was a great way to measure my growth as an AWS Solutions Architect. I was pleased to see that all the hard work I put into studying paid off, as I was able to pass the exam on my first try.
    Liam O'Brien     Apr 15, 2024
Compared to other websites, this one is much more affordable and provides the same questions and answers. I received a fantastic score of 90%.
    James Davis     Apr 14, 2024
The SAA-C03 exam dumps was a great way to validate my knowledge of AWS and the various services it offers. I appreciated the mix of technical and practical questions, as it allowed me to showcase my skills in multiple areas. Download AWS Certified Solutions Architect - Associate (SAA-C03) Sample Questions
    Jessica Zhang     Apr 14, 2024
I was nervous about taking the SAA-C03 exam, but after using the practice exams and study material provided by salesforcexamdumps, I felt well-prepared. The questions were a good mix of technical and practical, and I felt confident in my ability to answer them. Overall, it was a great experience and I'm happy to have passed!
    David Lee     Apr 13, 2024
The SAA-C03 exam was definitely a challenge, but I felt well-prepared thanks to the salesforcexamdumps.com AWS Certified Solutions Architect - Associate (SAA-C03) practice exams and study materials I used. I'm so glad I took the time to properly prepare because it paid off with a passing grade.
    Santos     Apr 13, 2024
The exam consisted of 65 questions, and 59 of them came from this study material. I achieved a mark of 90% on the test. Good luck to those taking the exam!
    Petrova     Apr 12, 2024
These SAA-C03 Practice tests feel like real exams! They are very accurate and I highly recommend them.
    Henrik Bjornsen     Apr 12, 2024
I am delighted to recommend this website to my friends. I personally used it to prepare for my SAA-C03 exam, and I can attest that the questions and answers were 100% accurate.
    Sarah Johnson     Apr 11, 2024
I recently took the SAA-C03 exam and I'm happy to report that I passed my AWS Certified Solutions Architect - Associate (SAA-C03) Exam on my first attempt! The questions on the exam were similar to the ones I practiced with through my study materials provided by salesforcexamdumps, which helped me feel confident and prepared.
    Muhammad Talha     Apr 11, 2024
I took the SAA-C03 exam after completing the AWS Solutions Architect Associate Dumps preparation, and I found it to be a natural progression in terms of difficulty. The questions were challenging, but they accurately reflected the skills and knowledge necessary for the role of a Solutions Architect. Today I passed my AWS Certified Solutions Architect - Associate (SAA-C03) Exam with 98% marks.
    Khan     Apr 10, 2024
I am thrilled to have discovered Salesforcexamdumps! It's amazing how easy it is to read, understand, and study each exam section, taking detailed notes. Thank you so much!
    Amelia Collins     Apr 10, 2024
These exam dumps are worth every penny I spent. I passed the SAA-C03 exam with flying colors thanks to these questions. Thanks Salesforcexamdumps.com.
