Are you tired of looking for a source that will keep you updated on the AWS Certified Solutions Architect - Associate (SAA-C03) Exam? One that also has a collection of affordable, high-quality, and incredibly easy Amazon SAA-C03 Practice Questions? Well then, you are in luck, because Salesforcexamdumps.com just updated them! Get ready to become an AWS Certified Associate.
Here are the available features of the Amazon SAA-C03 PDF:
550 questions with answers | Update date: 03 Oct, 2023
1 day of study required to pass the exam | 100% passing assurance
100% money-back guarantee | Free 3 months of updates
Amazon SAA-C03 is the certification exam you need to pass to get certified. The certification rewards deserving candidates who achieve strong results. The AWS Certified Associate certification validates a candidate's expertise in working with Amazon Web Services. In this fast-paced world, a certification is the quickest way to gain your employer's approval. Try your luck at passing the AWS Certified Solutions Architect - Associate (SAA-C03) Exam and become a certified professional today. Salesforcexamdumps.com is always eager to extend a helping hand by providing approved and accepted Amazon SAA-C03 Practice Questions. Passing AWS Certified Solutions Architect - Associate (SAA-C03) will be your ticket to a better future!
Contrary to the belief that certification exams are generally hard to get through, passing AWS Certified Solutions Architect - Associate (SAA-C03) is incredibly easy, provided you have access to a reliable resource such as the Salesforcexamdumps.com Amazon SAA-C03 PDF. We have been in this business long enough to understand where most resources go wrong. Passing the Amazon AWS Certified Associate certification is all about having the right information, so we filled our Amazon SAA-C03 Dumps with all the data you need to pass. These carefully curated sets of AWS Certified Solutions Architect - Associate (SAA-C03) Practice Questions target the most frequently repeated exam questions, so you know they are essential and can deliver passing results. Stop wasting your time waiting around and order your set of Amazon SAA-C03 Braindumps now!
We aim to provide all AWS Certified Associate certification exam candidates with the best resources at minimum rates. You can check out our free demo before downloading to make sure the Amazon SAA-C03 Practice Questions are what you wanted. And do not forget about the discount; we always give our customers a little extra.
Unlike other websites, Salesforcexamdumps.com prioritizes the needs of AWS Certified Solutions Architect - Associate (SAA-C03) candidates. Not every Amazon exam candidate has full-time access to the internet, and it is hard to sit in front of a computer screen for too many hours. Are you one of them? We understand, and that is why Amazon SAA-C03 Question Answers come in two formats: a PDF and an Online Test Engine. One is for customers who like an online platform with realistic exam simulation; the other is for those who prefer to keep their material close at hand. Moreover, you can download or print the Amazon SAA-C03 Dumps with ease.
If you still have questions, our team of experts is available 24/7 to answer them. Just leave us a quick message in the chat box below or email us at [email protected].
A company is deploying an application that processes large quantities of data in parallel. The company plans to use Amazon EC2 instances for the workload. The network architecture must be configurable to prevent groups of nodes from sharing the same underlying hardware. Which networking solution meets these requirements?
A. Run the EC2 instances in a spread placement group.
B. Group the EC2 instances in separate accounts.
C. Configure the EC2 instances with dedicated tenancy.
D. Configure the EC2 instances with shared tenancy.
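For reference only: a spread placement group (option A) places each instance on distinct underlying hardware. A minimal boto3 sketch, with a hypothetical AMI ID and group name, might look like this:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Create a spread placement group; instances in it land on distinct racks/hardware.
ec2.create_placement_group(GroupName="parallel-workload", Strategy="spread")

# Launch the worker instances into the placement group (AMI ID is hypothetical).
ec2.run_instances(
    ImageId="ami-0123456789abcdef0",
    InstanceType="c5.large",
    MinCount=7,
    MaxCount=7,  # a spread group supports up to 7 running instances per Availability Zone
    Placement={"GroupName": "parallel-workload"},
)
```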
A company is preparing a new data platform that will ingest real-time streaming data from multiple sources. The company needs to transform the data before writing the data to Amazon S3. The company needs the ability to use SQL to query the transformed data. Which solutions will meet these requirements? (Choose two.)
A. Use Amazon Kinesis Data Streams to stream the data. Use Amazon Kinesis Data Analytics to transform the data. Use Amazon Kinesis Data Firehose to write the data to Amazon S3. Use Amazon Athena to query the transformed data from Amazon S3.
B. Use Amazon Managed Streaming for Apache Kafka (Amazon MSK) to stream the data. Use AWS Glue to transform the data and to write the data to Amazon S3. Use Amazon Athena to query the transformed data from Amazon S3.
C. Use AWS Database Migration Service (AWS DMS) to ingest the data. Use Amazon EMR to transform the data and to write the data to Amazon S3. Use Amazon Athena to query the transformed data from Amazon S3.
D. Use Amazon Managed Streaming for Apache Kafka (Amazon MSK) to stream the data. Use Amazon Kinesis Data Analytics to transform the data and to write the data to Amazon S3. Use the Amazon RDS query editor to query the transformed data from Amazon S3.
E. Use Amazon Kinesis Data Streams to stream the data. Use AWS Glue to transform the data. Use Amazon Kinesis Data Firehose to write the data to Amazon S3. Use the Amazon RDS query editor to query the transformed data from Amazon S3.
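Several of the options end with Amazon Athena querying the transformed objects in Amazon S3. As a rough illustration (database, table, and bucket names are hypothetical), a transformed dataset could be queried with boto3 like this:

```python
import boto3

athena = boto3.client("athena", region_name="us-east-1")

# Run a SQL query against the table that the transformed S3 data is cataloged under.
response = athena.start_query_execution(
    QueryString="SELECT event_type, COUNT(*) AS events FROM transformed_events GROUP BY event_type",
    QueryExecutionContext={"Database": "streaming_platform"},  # hypothetical Glue database
    ResultConfiguration={"OutputLocation": "s3://athena-results-bucket/queries/"},
)
print(response["QueryExecutionId"])  # poll get_query_execution() until the query finishes
```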
A company runs an application on Amazon EC2 instances. The company needs to implement a disaster recovery (DR) solution for the application. The DR solution needs to have a recovery time objective (RTO) of less than 4 hours. The DR solution also needs to use the fewest possible AWS resources during normal operations. Which solution will meet these requirements in the MOST operationally efficient way?
A. Create Amazon Machine Images (AMIs) to back up the EC2 instances. Copy the AMIs to a secondary AWS Region. Automate infrastructure deployment in the secondary Region by using AWS Lambda and custom scripts.
B. Create Amazon Machine Images (AMIs) to back up the EC2 instances. Copy the AMIs to a secondary AWS Region. Automate infrastructure deployment in the secondary Region by using AWS CloudFormation.
C. Launch EC2 instances in a secondary AWS Region. Keep the EC2 instances in the secondary Region active at all times.
D. Launch EC2 instances in a secondary Availability Zone. Keep the EC2 instances in the secondary Availability Zone active at all times.
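Options A and B both rely on backing up the instances as AMIs and copying them to a second Region. A minimal boto3 sketch of that backup step (the instance ID and names are hypothetical) could look like this:

```python
import boto3

ec2_primary = boto3.client("ec2", region_name="us-east-1")
ec2_dr = boto3.client("ec2", region_name="us-west-2")

# Create an AMI from the running instance in the primary Region.
image = ec2_primary.create_image(
    InstanceId="i-0123456789abcdef0",
    Name="app-server-backup",
    NoReboot=True,
)

# Copy the AMI into the secondary (DR) Region; copy_image is called in the destination Region.
ec2_dr.copy_image(
    SourceImageId=image["ImageId"],
    SourceRegion="us-east-1",
    Name="app-server-backup-dr",
)
```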
A solutions architect needs to review a company's Amazon S3 buckets to discover personally identifiable information (PII). The company stores the PII data in the us-east-1 Region and us-west-2 Region. Which solution will meet these requirements with the LEAST operational overhead?
A. Configure Amazon Macie in each Region. Create a job to analyze the data that is in Amazon S3.
B. Configure AWS Security Hub for all Regions. Create an AWS Config rule to analyze the data that is in Amazon S3.
C. Configure Amazon Inspector to analyze the data that is in Amazon S3.
D. Configure Amazon GuardDuty to analyze the data that is in Amazon S3.
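As an illustration of option A only (not an answer key), an Amazon Macie sensitive-data discovery job is created per Region; the sketch below uses boto3 with a hypothetical account ID and bucket name:

```python
import uuid
import boto3

# Macie jobs are Regional, so this would be repeated in us-east-1 and us-west-2.
macie = boto3.client("macie2", region_name="us-east-1")

macie.create_classification_job(
    clientToken=str(uuid.uuid4()),
    jobType="ONE_TIME",
    name="pii-discovery-us-east-1",
    s3JobDefinition={
        "bucketDefinitions": [
            {"accountId": "111122223333", "buckets": ["example-data-bucket"]}
        ]
    },
)
```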
A solutions architect is designing a workload that will store hourly energy consumption by business tenants in a building. The sensors will feed a database through HTTP requests that will add up usage for each tenant. The solutions architect must use managed services when possible. The workload will receive more features in the future as the solutions architect adds independent components. Which solution will meet these requirements with the LEAST operational overhead?
A. Use Amazon API Gateway with AWS Lambda functions to receive the data from the sensors, process the data, and store the data in an Amazon DynamoDB table.
B. Use an Elastic Load Balancer that is supported by an Auto Scaling group of Amazon EC2 instances to receive and process the data from the sensors. Use an Amazon S3 bucket to store the processed data.
C. Use Amazon API Gateway with AWS Lambda functions to receive the data from the sensors, process the data, and store the data in a Microsoft SQL Server Express database on an Amazon EC2 instance.
D. Use an Elastic Load Balancer that is supported by an Auto Scaling group of Amazon EC2 instances to receive and process the data from the sensors. Use an Amazon Elastic File System (Amazon EFS) shared file system to store the processed data.
A company is deploying a new public web application to AWS. The application will run behind an Application Load Balancer (ALB). The application needs to be encrypted at the edge with an SSL/TLS certificate that is issued by an external certificate authority (CA). The certificate must be rotated each year before the certificate expires. What should a solutions architect do to meet these requirements?
A. Use AWS Certificate Manager (ACM) to issue an SSL/TLS certificate. Apply the certificate to the ALB. Use the managed renewal feature to automatically rotate the certificate.
B. Use AWS Certificate Manager (ACM) to issue an SSL/TLS certificate. Import the key material from the certificate. Apply the certificate to the ALB. Use the managed renewal feature to automatically rotate the certificate.
C. Use AWS Private Certificate Authority to issue an SSL/TLS certificate from the root CA. Apply the certificate to the ALB. Use the managed renewal feature to automatically rotate the certificate.
D. Use AWS Certificate Manager (ACM) to import an SSL/TLS certificate. Apply the certificate to the ALB. Use Amazon EventBridge to send a notification when the certificate is nearing expiration. Rotate the certificate manually.
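For certificates issued by an external CA, ACM can only import the certificate; it cannot renew it. A minimal boto3 import sketch (the PEM file names are hypothetical) is shown below; the returned ARN is what gets attached to the ALB's HTTPS listener:

```python
import boto3

acm = boto3.client("acm", region_name="us-east-1")

# Import an externally issued certificate into ACM (file paths are hypothetical).
with open("certificate.pem", "rb") as cert, \
     open("private_key.pem", "rb") as key, \
     open("chain.pem", "rb") as chain:
    response = acm.import_certificate(
        Certificate=cert.read(),
        PrivateKey=key.read(),
        CertificateChain=chain.read(),
    )

print(response["CertificateArn"])  # attach this certificate ARN to the ALB HTTPS listener
```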
A company runs an infrastructure monitoring service. The company is building a new feature that will enable the service to monitor data in customer AWS accounts. The new feature will call AWS APIs in customer accounts to describe Amazon EC2 instances and read Amazon CloudWatch metrics. What should the company do to obtain access to customer accounts in the MOST secure way?
A. Ensure that the customers create an IAM role in their account with read-only EC2 and CloudWatch permissions and a trust policy to the company's account.
B. Create a serverless API that implements a token vending machine to provide temporary AWS credentials for a role with read-only EC2 and CloudWatch permissions.
C. Ensure that the customers create an IAM user in their account with read-only EC2 and CloudWatch permissions. Encrypt and store customer access and secret keys in a secrets management system.
D. Ensure that the customers create an Amazon Cognito user in their account to use an IAM role with read-only EC2 and CloudWatch permissions. Encrypt and store the Amazon Cognito user and password in a secrets management system.
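To illustrate the cross-account role pattern in option A (without asserting it as the answer key), a customer could create a role that trusts the monitoring company's account; the account ID, external ID, and role name below are hypothetical:

```python
import json
import boto3

iam = boto3.client("iam")

# Trust policy: only the monitoring company's account (hypothetical ID) may assume this role,
# and only when it supplies the agreed external ID.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": "arn:aws:iam::999988887777:root"},
        "Action": "sts:AssumeRole",
        "Condition": {"StringEquals": {"sts:ExternalId": "example-external-id"}},
    }],
}

iam.create_role(RoleName="MonitoringReadOnly",
                AssumeRolePolicyDocument=json.dumps(trust_policy))

# Grant only the read-only permissions the feature needs.
iam.attach_role_policy(RoleName="MonitoringReadOnly",
                       PolicyArn="arn:aws:iam::aws:policy/AmazonEC2ReadOnlyAccess")
iam.attach_role_policy(RoleName="MonitoringReadOnly",
                       PolicyArn="arn:aws:iam::aws:policy/CloudWatchReadOnlyAccess")
```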
A company has two VPCs that are located in the us-west-2 Region within the same AWS account. The company needs to allow network traffic between these VPCs. Approximately 500 GB of data transfer will occur between the VPCs each month. What is the MOST cost-effective solution to connect these VPCs?
A. Implement AWS Transit Gateway to connect the VPCs. Update the route tables of each VPC to use the transit gateway for inter-VPC communication.
B. Implement an AWS Site-to-Site VPN tunnel between the VPCs. Update the route tables of each VPC to use the VPN tunnel for inter-VPC communication.
C. Set up a VPC peering connection between the VPCs. Update the route tables of each VPC to use the VPC peering connection for inter-VPC communication.
D. Set up a 1 Gbps AWS Direct Connect connection between the VPCs. Update the route tables of each VPC to use the Direct Connect connection for inter-VPC communication.
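As an illustration of option C, VPC peering between two VPCs in the same account and Region takes one peering connection plus a route in each VPC's route table; the IDs and CIDR blocks below are hypothetical:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")

# Request and accept the peering connection (same account, so it can be self-accepted).
peering = ec2.create_vpc_peering_connection(VpcId="vpc-0aaa1111", PeerVpcId="vpc-0bbb2222")
pcx_id = peering["VpcPeeringConnection"]["VpcPeeringConnectionId"]
ec2.accept_vpc_peering_connection(VpcPeeringConnectionId=pcx_id)

# Point each VPC's route table at the other VPC's CIDR via the peering connection.
ec2.create_route(RouteTableId="rtb-0aaa1111", DestinationCidrBlock="10.1.0.0/16",
                 VpcPeeringConnectionId=pcx_id)
ec2.create_route(RouteTableId="rtb-0bbb2222", DestinationCidrBlock="10.0.0.0/16",
                 VpcPeeringConnectionId=pcx_id)
```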
A company runs a web application that is deployed on Amazon EC2 instances in the private subnet of a VPC. An Application Load Balancer (ALB) that extends across the public subnets directs web traffic to the EC2 instances. The company wants to implement new security measures to restrict inbound traffic from the ALB to the EC2 instances while preventing access from any other source inside or outside the private subnet of the EC2 instances. Which solution will meet these requirements?
A. Configure a route in a route table to direct traffic from the internet to the private IP addresses of the EC2 instances.
B. Configure the security group for the EC2 instances to only allow traffic that comes from the security group for the ALB.
C. Move the EC2 instances into the public subnet. Give the EC2 instances a set of Elastic IP addresses.
D. Configure the security group for the ALB to allow any TCP traffic on any port.
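Option B describes security group chaining, where the instance security group allows ingress only from the ALB's security group rather than from IP ranges. A boto3 sketch with hypothetical group IDs:

```python
import boto3

ec2 = boto3.client("ec2")

# Allow HTTP into the application instances only when the traffic originates
# from the ALB's security group (group IDs are hypothetical).
ec2.authorize_security_group_ingress(
    GroupId="sg-0app1234567890abc",
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 80,
        "ToPort": 80,
        "UserIdGroupPairs": [{
            "GroupId": "sg-0alb1234567890abc",
            "Description": "Allow HTTP only from the ALB",
        }],
    }],
)
```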
A company runs applications on AWS that connect to the company's Amazon RDS database. The applications scale on weekends and at peak times of the year. The company wants to scale the database more effectively for its applications that connect to the database. Which solution will meet these requirements with the LEAST operational overhead?
A. Use Amazon DynamoDB with connection pooling with a target group configuration for the database. Change the applications to use the DynamoDB endpoint.
B. Use Amazon RDS Proxy with a target group for the database. Change the applications to use the RDS Proxy endpoint.
C. Use a custom proxy that runs on Amazon EC2 as an intermediary to the database. Change the applications to use the custom proxy endpoint.
D. Use an AWS Lambda function to provide connection pooling with a target group configuration for the database. Change the applications to use the Lambda function.
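For reference, option B's Amazon RDS Proxy sits between the applications and the database and pools connections. A rough boto3 sketch (ARNs, subnet IDs, and names are hypothetical):

```python
import boto3

rds = boto3.client("rds")

# Create the proxy; it authenticates to the database with credentials stored in Secrets Manager.
rds.create_db_proxy(
    DBProxyName="app-db-proxy",
    EngineFamily="MYSQL",  # or "POSTGRESQL", depending on the RDS engine
    Auth=[{
        "AuthScheme": "SECRETS",
        "SecretArn": "arn:aws:secretsmanager:us-east-1:111122223333:secret:db-credentials",
    }],
    RoleArn="arn:aws:iam::111122223333:role/rds-proxy-secrets-access",
    VpcSubnetIds=["subnet-0aaa1111", "subnet-0bbb2222"],
)

# Register the existing DB instance as the proxy target; apps then use the proxy endpoint.
rds.register_db_proxy_targets(
    DBProxyName="app-db-proxy",
    DBInstanceIdentifiers=["app-db-instance"],
)
```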
A company runs an application that uses Amazon RDS for PostgreSQL. The application receives traffic only on weekdays during business hours. The company wants to optimize costs and reduce operational overhead based on this usage. Which solution will meet these requirements?
A. Use the Instance Scheduler on AWS to configure start and stop schedules.
B. Turn off automatic backups. Create weekly manual snapshots of the database.
C. Create a custom AWS Lambda function to start and stop the database based on minimum CPU utilization.
D. Purchase All Upfront reserved DB instances.
A company needs to connect several VPCs in the us-east-1 Region that span hundreds of AWS accounts. The company's networking team has its own AWS account to manage the cloud network. What is the MOST operationally efficient solution to connect the VPCs?
A. Set up VPC peering connections between each VPC. Update each associated subnet's route table.
B. Configure a NAT gateway and an internet gateway in each VPC to connect each VPC through the internet.
C. Create an AWS Transit Gateway in the networking team's AWS account. Configure static routes from each VPC.
D. Deploy VPN gateways in each VPC. Create a transit VPC in the networking team's AWS account to connect to each VPC.
A company has created a multi-tier application for its ecommerce website. The website uses an Application Load Balancer that resides in the public subnets, a web tier in the public subnets, and a MySQL cluster hosted on Amazon EC2 instances in the private subnets. The MySQL database needs to retrieve product catalog and pricing information that is hosted on the internet by a third-party provider. A solutions architect must devise a strategy that maximizes security without increasing operational overhead. What should the solutions architect do to meet these requirements?
A. Deploy a NAT instance in the VPC. Route all the internet-based traffic through the NAT instance.
B. Deploy a NAT gateway in the public subnets. Modify the private subnet route table to direct all internet-bound traffic to the NAT gateway.
C. Configure an internet gateway and attach it to the VPC. Modify the private subnet route table to direct internet-bound traffic to the internet gateway.
D. Configure a virtual private gateway and attach it to the VPC. Modify the private subnet route table to direct internet-bound traffic to the virtual private gateway.
A solutions architect is designing a highly available Amazon ElastiCache for Redis based solution. The solutions architect needs to ensure that failures do not result in performance degradation or loss of data locally and within an AWS Region. The solution needs to provide high availability at the node level and at the Region level. Which solution will meet these requirements?
A. Use Multi-AZ Redis replication groups with shards that contain multiple nodes.
B. Use Redis shards that contain multiple nodes with Redis append only files (AOF) turned on.
C. Use a Multi-AZ Redis cluster with more than one read replica in the replication group.
D. Use Redis shards that contain multiple nodes with Auto Scaling turned on.
A company runs its applications on Amazon EC2 instances. The company performs periodic financial assessments of its AWS costs. The company recently identified unusual spending. The company needs a solution to prevent unusual spending. The solution must monitor costs and notify responsible stakeholders in the event of unusual spending. Which solution will meet these requirements?
A. Use an AWS Budgets template to create a zero spend budget.
B. Create an AWS Cost Anomaly Detection monitor in the AWS Billing and Cost Management console.
C. Create AWS Pricing Calculator estimates for the current running workload pricing details.
D. Use Amazon CloudWatch to monitor costs and to identify unusual spending.
A company wants to use an event-driven programming model with AWS Lambda. The company wants to reduce startup latency for Lambda functions that run on Java 11. The company does not have strict latency requirements for the applications. The company wants to reduce cold starts and outlier latencies when a function scales up. Which solution will meet these requirements MOST cost-effectively?
A. Configure Lambda provisioned concurrency.
B. Increase the timeout of the Lambda functions.
C. Increase the memory of the Lambda functions.
D. Configure Lambda SnapStart.
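Option D, Lambda SnapStart, is enabled in the function configuration and applies to published versions. A minimal boto3 sketch with a hypothetical function name:

```python
import boto3

lambda_client = boto3.client("lambda")

# Enable SnapStart for published versions of a Java 11 function (name is hypothetical).
lambda_client.update_function_configuration(
    FunctionName="java11-order-service",
    SnapStart={"ApplyOn": "PublishedVersions"},
)

# SnapStart only takes effect on published versions, so publish one and invoke
# that version (or an alias that points to it) instead of $LATEST.
lambda_client.publish_version(FunctionName="java11-order-service")
```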
A company needs to minimize the cost of its 1 Gbps AWS Direct Connect connection. The company's average connection utilization is less than 10%. A solutions architect must recommend a solution that will reduce the cost without compromising security. Which solution will meet these requirements?
A. Set up a new 1 Gbps Direct Connect connection. Share the connection with another AWS account.
B. Set up a new 200 Mbps Direct Connect connection in the AWS Management Console.
C. Contact an AWS Direct Connect Partner to order a 1 Gbps connection. Share the connection with another AWS account.
D. Contact an AWS Direct Connect Partner to order a 200 Mbps hosted connection for an existing AWS account.
A company runs a website that stores images of historical events. Website users need the ability to search and view images based on the year that the event in the image occurred. On average, users request each image only once or twice a year. The company wants a highly available solution to store and deliver the images to users. Which solution will meet these requirements MOST cost-effectively?
A. Store images in Amazon Elastic Block Store (Amazon EBS). Use a web server that runs on Amazon EC2.
B. Store images in Amazon Elastic File System (Amazon EFS). Use a web server that runs on Amazon EC2.
C. Store images in Amazon S3 Standard. Use S3 Standard to directly deliver images by using a static website.
D. Store images in Amazon S3 Standard-Infrequent Access (S3 Standard-IA). Use S3 Standard-IA to directly deliver images by using a static website.
A company runs a website that uses a content management system (CMS) on Amazon EC2. The CMS runs on a single EC2 instance and uses an Amazon Aurora MySQL Multi-AZ DB instance for the data tier. Website images are stored on an Amazon Elastic Block Store (Amazon EBS) volume that is mounted inside the EC2 instance. Which combination of actions should a solutions architect take to improve the performance and resilience of the website? (Select TWO.)
A. Move the website images into an Amazon S3 bucket that is mounted on every EC2 instance.
B. Share the website images by using an NFS share from the primary EC2 instance. Mount this share on the other EC2 instances.
C. Move the website images onto an Amazon Elastic File System (Amazon EFS) file system that is mounted on every EC2 instance.
D. Create an Amazon Machine Image (AMI) from the existing EC2 instance. Use the AMI to provision new instances behind an Application Load Balancer as part of an Auto Scaling group. Configure the Auto Scaling group to maintain a minimum of two instances. Configure an accelerator in AWS Global Accelerator for the website.
E. Create an Amazon Machine Image (AMI) from the existing EC2 instance. Use the AMI to provision new instances behind an Application Load Balancer as part of an Auto Scaling group. Configure the Auto Scaling group to maintain a minimum of two instances. Configure an Amazon CloudFront distribution for the website.
A company has an on-premises MySQL database that handles transactional data. The company is migrating the database to the AWS Cloud. The migrated database must maintain compatibility with the company's applications that use the database. The migrated database also must scale automatically during periods of increased demand. Which migration solution will meet these requirements?
A. Use native MySQL tools to migrate the database to Amazon RDS for MySQL. Configure elastic storage scaling.
B. Migrate the database to Amazon Redshift by using the mysqldump utility. Turn on Auto Scaling for the Amazon Redshift cluster.
C. Use AWS Database Migration Service (AWS DMS) to migrate the database to Amazon Aurora. Turn on Aurora Auto Scaling.
D. Use AWS Database Migration Service (AWS DMS) to migrate the database to Amazon DynamoDB. Configure an Auto Scaling policy.
A company is using AWS Key Management Service (AWS KMS) keys to encrypt AWS Lambda environment variables. A solutions architect needs to ensure that the required permissions are in place to decrypt and use the environment variables. Which steps must the solutions architect take to implement the correct permissions? (Choose two.)
A. Add AWS KMS permissions in the Lambda resource policy.
B. Add AWS KMS permissions in the Lambda execution role.
C. Add AWS KMS permissions in the Lambda function policy.
D. Allow the Lambda execution role in the AWS KMS key policy.
E. Allow the Lambda resource policy in the AWS KMS key policy.
A company operates an ecommerce website on Amazon EC2 instances behind an Application Load Balancer (ALB) in an Auto Scaling group. The site is experiencing performance issues related to a high request rate from illegitimate external systems with changing IP addresses. The security team is worried about potential DDoS attacks against the website. The company must block the illegitimate incoming requests in a way that has a minimal impact on legitimate users. What should a solutions architect recommend?
A. Deploy Amazon Inspector and associate it with the ALB.
B. Deploy AWS WAF, associate it with the ALB, and configure a rate-limiting rule.
C. Deploy rules to the network ACLs associated with the ALB to block the incoming traffic.
D. Deploy Amazon GuardDuty and enable rate-limiting protection when configuring GuardDuty.
A company uses Amazon Elastic Kubernetes Service (Amazon EKS) to run a container application. The EKS cluster stores sensitive information in the Kubernetes secrets object. The company wants to ensure that the information is encrypted. Which solution will meet these requirements with the LEAST operational overhead?
A. Use the container application to encrypt the information by using AWS Key Management Service (AWS KMS).
B. Enable secrets encryption in the EKS cluster by using AWS Key Management Service (AWS KMS).
C. Implement an AWS Lambda function to encrypt the information by using AWS Key Management Service (AWS KMS).
D. Use AWS Systems Manager Parameter Store to encrypt the information by using AWS Key Management Service (AWS KMS).
A company runs a three-tier application in two AWS Regions. The web tier, the application tier, and the database tier run on Amazon EC2 instances. The company uses Amazon RDS for Microsoft SQL Server Enterprise for the database tier. The database tier is experiencing high load when weekly and monthly reports are run. The company wants to reduce the load on the database tier. Which solution will meet these requirements with the LEAST administrative effort?
A. Create read replicas. Configure the reports to use the new read replicas.
B. Convert the RDS database to Amazon DynamoDB. Configure the reports to use DynamoDB.
C. Modify the existing RDS DB instances by selecting a larger instance size.
D. Modify the existing RDS DB instances and put the instances into an Auto Scaling group.
A company is building an ecommerce application and needs to store sensitive customer information. The company needs to give customers the ability to complete purchase transactions on the website. The company also needs to ensure that sensitive customer data is protected, even from database administrators. Which solution meets these requirements?
A. Store sensitive data in an Amazon Elastic Block Store (Amazon EBS) volume. Use EBS encryption to encrypt the data. Use an IAM instance role to restrict access.
B. Store sensitive data in Amazon RDS for MySQL. Use AWS Key Management Service (AWS KMS) client-side encryption to encrypt the data.
C. Store sensitive data in Amazon S3. Use AWS Key Management Service (AWS KMS) server-side encryption to encrypt the data. Use S3 bucket policies to restrict access.
D. Store sensitive data in Amazon FSx for Windows Server. Mount the file share on application servers. Use Windows file permissions to restrict access.
A company is moving its data and applications to AWS during a multiyear migration project. The company wants to securely access data on Amazon S3 from the company's AWS Region and from the company's on-premises location. The data must not traverse the internet. The company has established an AWS Direct Connect connection between its Region and its on-premises location. Which solution will meet these requirements?
A. Create gateway endpoints for Amazon S3. Use the gateway endpoints to securely access the data from the Region and the on-premises location.
B. Create a gateway in AWS Transit Gateway to access Amazon S3 securely from the Region and the on-premises location.
C. Create interface endpoints for Amazon S3. Use the interface endpoints to securely access the data from the Region and the on-premises location.
D. Use an AWS Key Management Service (AWS KMS) key to access the data securely from the Region and the on-premises location.
A company has a financial application that produces reports. The reports average 50 KB in size and are stored in Amazon S3. The reports are frequently accessed during the first week after production and must be stored for several years. The reports must be retrievable within 6 hours. Which solution meets these requirements MOST cost-effectively?
A. Use S3 Standard. Use an S3 Lifecycle rule to transition the reports to S3 Glacier after 7 days.
B. Use S3 Standard. Use an S3 Lifecycle rule to transition the reports to S3 Standard-Infrequent Access (S3 Standard-IA) after 7 days.
C. Use S3 Intelligent-Tiering. Configure S3 Intelligent-Tiering to transition the reports to S3 Standard-Infrequent Access (S3 Standard-IA) and S3 Glacier.
D. Use S3 Standard. Use an S3 Lifecycle rule to transition the reports to S3 Glacier Deep Archive after 7 days.
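All four options hinge on S3 Lifecycle transitions. As a sketch only (the bucket name and prefix are hypothetical), the transition described in option A could be configured like this:

```python
import boto3

s3 = boto3.client("s3")

# Transition objects under the reports/ prefix to S3 Glacier Flexible Retrieval after 7 days.
s3.put_bucket_lifecycle_configuration(
    Bucket="financial-reports-bucket",
    LifecycleConfiguration={
        "Rules": [{
            "ID": "reports-to-glacier-after-7-days",
            "Status": "Enabled",
            "Filter": {"Prefix": "reports/"},
            "Transitions": [{"Days": 7, "StorageClass": "GLACIER"}],
        }]
    },
)
```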
A company needs to store contract documents. A contract lasts for 5 years. During the 5-year period, the company must ensure that the documents cannot be overwritten or deleted. The company needs to encrypt the documents at rest and rotate the encryption keys automatically every year. Which combination of steps should a solutions architect take to meet these requirements with the LEAST operational overhead? (Select TWO.)
A. Store the documents in Amazon S3. Use S3 Object Lock in governance mode.
B. Store the documents in Amazon S3. Use S3 Object Lock in compliance mode.
C. Use server-side encryption with Amazon S3 managed encryption keys (SSE-S3). Configure key rotation.
D. Use server-side encryption with AWS Key Management Service (AWS KMS) customer managed keys. Configure key rotation.
E. Use server-side encryption with AWS Key Management Service (AWS KMS) customer provided (imported) keys. Configure key rotation.
A company hosts an internal serverless application on AWS by using Amazon API Gateway and AWS Lambda. The company's employees report issues with high latency when they begin using the application each day. The company wants to reduce latency. Which solution will meet these requirements?
A. Increase the API Gateway throttling limit.
B. Set up scheduled scaling to increase Lambda provisioned concurrency before employees begin to use the application each day.
C. Create an Amazon CloudWatch alarm to initiate a Lambda function as a target for the alarm at the beginning of each day.
D. Increase the Lambda function memory.
A company offers a food delivery service that is growing rapidly. Because of the growth, the company's order processing system is experiencing scaling problems during peak traffic hours. The current architecture includes the following:
• A group of Amazon EC2 instances that run in an Amazon EC2 Auto Scaling group to collect orders from the application
• Another group of EC2 instances that run in an Amazon EC2 Auto Scaling group to fulfill orders
The order collection process occurs quickly, but the order fulfillment process can take longer. Data must not be lost because of a scaling event. A solutions architect must ensure that the order collection process and the order fulfillment process can both scale properly during peak traffic hours. The solution must optimize utilization of the company's AWS resources. Which solution meets these requirements?
A. Use Amazon CloudWatch metrics to monitor the CPU of each instance in the Auto Scaling groups. Configure each Auto Scaling group's minimum capacity according to peak workload values.
B. Use Amazon CloudWatch metrics to monitor the CPU of each instance in the Auto Scaling groups. Configure a CloudWatch alarm to invoke an Amazon Simple Notification Service (Amazon SNS) topic that creates additional Auto Scaling groups on demand.
C. Provision two Amazon Simple Queue Service (Amazon SQS) queues: one for order collection and another for order fulfillment. Configure the EC2 instances to poll their respective queue. Scale the Auto Scaling groups based on notifications that the queues send.
D. Provision two Amazon Simple Queue Service (Amazon SQS) queues: one for order collection and another for order fulfillment. Configure the EC2 instances to poll their respective queue. Create a metric based on a backlog per instance calculation. Scale the Auto Scaling groups based on this metric.
A company sends AWS CloudTrail logs from multiple AWS accounts to an Amazon S3 bucket in a centralized account. The company must keep the CloudTrail logs. The company must also be able to query the CloudTrail logs at any time. Which solution will meet these requirements?
A. Use the CloudTrail event history in the centralized account to create an Amazon Athena table. Query the CloudTrail logs from Athena.
B. Configure an Amazon Neptune instance to manage the CloudTrail logs. Query the CloudTrail logs from Neptune.
C. Configure CloudTrail to send the logs to an Amazon DynamoDB table. Create a dashboard in Amazon QuickSight to query the logs in the table.
D. Use Amazon Athena to create an Athena notebook. Configure CloudTrail to send the logs to the notebook. Run queries from Athena.
A company stores its data on premises. The amount of data is growing beyond the company's available capacity. The company wants to migrate its data from the on-premises location to an Amazon S3 bucket. The company needs a solution that will automatically validate the integrity of the data after the transfer. Which solution will meet these requirements?
A. Order an AWS Snowball Edge device. Configure the Snowball Edge device to perform the online data transfer to an S3 bucket.
B. Deploy an AWS DataSync agent on premises. Configure the DataSync agent to perform the online data transfer to an S3 bucket.
C. Create an Amazon S3 File Gateway on premises. Configure the S3 File Gateway to perform the online data transfer to an S3 bucket.
D. Configure an accelerator in Amazon S3 Transfer Acceleration on premises. Configure the accelerator to perform the online data transfer to an S3 bucket.
A company is building a RESTful serverless web application on AWS by using Amazon API Gateway and AWS Lambda. The users of this web application will be geographically distributed, and the company wants to reduce the latency of API requests to these users. Which type of endpoint should a solutions architect use to meet these requirements?
A. Private endpoint
B. Regional endpoint
C. Interface VPC endpoint
D. Edge-optimized endpoint
A company runs multiple Amazon EC2 Linux instances in a VPC across two Availability Zones. The instances host applications that use a hierarchical directory structure. The applications need to read and write rapidly and concurrently to shared storage. What should a solutions architect do to meet these requirements?
A. Create an Amazon S3 bucket. Allow access from all the EC2 instances in the VPC.
B. Create an Amazon Elastic File System (Amazon EFS) file system. Mount the EFS file system from each EC2 instance.
C. Create a file system on a Provisioned IOPS SSD (io2) Amazon Elastic Block Store (Amazon EBS) volume. Attach the EBS volume to all the EC2 instances.
D. Create file systems on Amazon Elastic Block Store (Amazon EBS) volumes that are attached to each EC2 instance. Synchronize the EBS volumes across the different EC2 instances.
A solutions architect is using an AWS CloudFormation template to deploy a three-tier web application. The web application consists of a web tier and an application tier that stores and retrieves user data in Amazon DynamoDB tables. The web and application tiers are hosted on Amazon EC2 instances, and the database tier is not publicly accessible. The application EC2 instances need to access the DynamoDB tables without exposing API credentials in the template. What should the solutions architect do to meet these requirements?
A. Create an IAM role to read the DynamoDB tables. Associate the role with the application instances by referencing an instance profile.
B. Create an IAM role that has the required permissions to read and write from the DynamoDB tables. Add the role to the EC2 instance profile, and associate the instance profile with the application instances.
C. Use the parameter section in the AWS CloudFormation template to have the user input access and secret keys from an already-created IAM user that has the required permissions to read and write from the DynamoDB tables.
D. Create an IAM user in the AWS CloudFormation template that has the required permissions to read and write from the DynamoDB tables. Use the GetAtt function to retrieve the access and secret keys, and pass them to the application instances through the user data.
A solutions architect is designing the storage architecture for a new web application used for storing and viewing engineering drawings. All application components will be deployed on the AWS infrastructure. The application design must support caching to minimize the amount of time that users wait for the engineering drawings to load. The application must be able to store petabytes of data. Which combination of storage and caching should the solutions architect use?
A. Amazon S3 with Amazon CloudFront
B. Amazon S3 Glacier with Amazon ElastiCache
C. Amazon Elastic Block Store (Amazon EBS) volumes with Amazon CloudFront
D. AWS Storage Gateway with Amazon ElastiCache
A company's website handles millions of requests each day, and the number of requests continues to increase. A solutions architect needs to improve the response time of the web application. The solutions architect determines that the application needs to decrease latency when retrieving product details from the Amazon DynamoDB table. Which solution will meet these requirements with the LEAST amount of operational overhead?
A. Set up a DynamoDB Accelerator (DAX) cluster. Route all read requests through DAX.
B. Set up Amazon ElastiCache for Redis between the DynamoDB table and the web application. Route all read requests through Redis.
C. Set up Amazon ElastiCache for Memcached between the DynamoDB table and the web application. Route all read requests through Memcached.
D. Set up Amazon DynamoDB Streams on the table, and have AWS Lambda read from the table and populate Amazon ElastiCache. Route all read requests through ElastiCache.
A social media company is building a feature for its website. The feature will give users the ability to upload photos. The company expects significant increases in demand during large events and must ensure that the website can handle the upload traffic from users. Which solution meets these requirements with the MOST scalability?
A. Upload files from the user's browser to the application servers. Transfer the files to an Amazon S3 bucket.
B. Provision an AWS Storage Gateway file gateway. Upload files directly from the user's browser to the file gateway.
C. Generate Amazon S3 presigned URLs in the application. Upload files directly from the user's browser into an S3 bucket.
D. Provision an Amazon Elastic File System (Amazon EFS) file system. Upload files directly from the user's browser to the file system.
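Option C relies on S3 presigned URLs, which let the browser upload directly to S3 without the traffic passing through application servers. A short boto3 sketch (the bucket and key are hypothetical):

```python
import boto3

s3 = boto3.client("s3")

# Generate a time-limited URL that allows a single PUT of this object.
upload_url = s3.generate_presigned_url(
    ClientMethod="put_object",
    Params={
        "Bucket": "event-photo-uploads",
        "Key": "uploads/user123/photo.jpg",
        "ContentType": "image/jpeg",
    },
    ExpiresIn=300,  # URL is valid for 5 minutes
)

# The browser can now PUT the file straight to S3, for example:
#   curl -X PUT -H "Content-Type: image/jpeg" --data-binary @photo.jpg "<upload_url>"
print(upload_url)
```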
A company hosts an application on Amazon EC2 instances that run in a single Availability Zone. The application is accessible by using the transport layer of the Open Systems Interconnection (OSI) model. The company needs the application architecture to have high availability. Which combination of steps will meet these requirements MOST cost-effectively? (Select TWO.)
A. Configure new EC2 instances in a different Availability Zone. Use Amazon Route 53 to route traffic to all instances.
B. Configure a Network Load Balancer in front of the EC2 instances.
C. Configure a Network Load Balancer for TCP traffic to the instances. Configure an Application Load Balancer for HTTP and HTTPS traffic to the instances.
D. Create an Auto Scaling group for the EC2 instances. Configure the Auto Scaling group to use multiple Availability Zones. Configure the Auto Scaling group to run application health checks on the instances.
E. Create an Amazon CloudWatch alarm. Configure the alarm to restart EC2 instances that transition to a stopped state.
A company has a service that reads and writes large amounts of data from an Amazon S3 bucket in the same AWS Region. The service is deployed on Amazon EC2 instances within the private subnet of a VPC. The service communicates with Amazon S3 over a NAT gateway in the public subnet. However, the company wants a solution that will reduce the data output costs. Which solution will meet these requirements MOST cost-effectively?
A. Provision a dedicated EC2 NAT instance in the public subnet. Configure the route table for the private subnet to use the elastic network interface of this instance as the destination for all S3 traffic.
B. Provision a dedicated EC2 NAT instance in the private subnet. Configure the route table for the public subnet to use the elastic network interface of this instance as the destination for all S3 traffic.
C. Provision a VPC gateway endpoint. Configure the route table for the private subnet to use the gateway endpoint as the route for all S3 traffic.
D. Provision a second NAT gateway. Configure the route table for the private subnet to use this NAT gateway as the destination for all S3 traffic.
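Option C's gateway endpoint keeps S3 traffic on the AWS network and avoids NAT gateway data processing charges. A minimal boto3 sketch with hypothetical VPC and route table IDs:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Create an S3 gateway endpoint and associate it with the private subnet's route table.
ec2.create_vpc_endpoint(
    VpcId="vpc-0aaa1111",
    ServiceName="com.amazonaws.us-east-1.s3",
    VpcEndpointType="Gateway",
    RouteTableIds=["rtb-0private111"],
)
# The associated route table automatically receives a route to the S3 prefix list via the endpoint.
```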
A company is planning to use an Amazon DynamoDB table for data storage. The company is concerned about cost optimization. The table will not be used on most mornings. In the evenings, the read and write traffic will often be unpredictable. When traffic spikes occur, they will happen very quickly. What should a solutions architect recommend?
A. Create a DynamoDB table in on-demand capacity mode.
B. Create a DynamoDB table with a global secondary index.
C. Create a DynamoDB table with provisioned capacity and auto scaling.
D. Create a DynamoDB table in provisioned capacity mode, and configure it as a global table.
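For context, option A's on-demand capacity mode is selected through the table's billing mode. A boto3 sketch with a hypothetical table name and key schema:

```python
import boto3

dynamodb = boto3.client("dynamodb")

# PAY_PER_REQUEST is DynamoDB's on-demand capacity mode: no capacity planning,
# and the table absorbs sudden traffic spikes without pre-provisioned throughput.
dynamodb.create_table(
    TableName="evening-workload-table",
    AttributeDefinitions=[{"AttributeName": "pk", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "pk", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",
)
```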
A manufacturing company has machine sensors that upload .csv files to an Amazon S3 bucket. These .csv files must be converted into images and must be made available as soon as possible for the automatic generation of graphical reports. The images become irrelevant after 1 month, but the .csv files must be kept to train machine learning (ML) models twice a year. The ML trainings and audits are planned weeks in advance. Which combination of steps will meet these requirements MOST cost-effectively? (Select TWO.)
A. Launch an Amazon EC2 Spot Instance that downloads the .csv files every hour, generates the image files, and uploads the images to the S3 bucket.
B. Design an AWS Lambda function that converts the .csv files into images and stores the images in the S3 bucket. Invoke the Lambda function when a .csv file is uploaded.
C. Create S3 Lifecycle rules for .csv files and image files in the S3 bucket. Transition the .csv files from S3 Standard to S3 Glacier 1 day after they are uploaded. Expire the image files after 30 days.
D. Create S3 Lifecycle rules for .csv files and image files in the S3 bucket. Transition the .csv files from S3 Standard to S3 One Zone-Infrequent Access (S3 One Zone-IA) 1 day after they are uploaded. Expire the image files after 30 days.
E. Create S3 Lifecycle rules for .csv files and image files in the S3 bucket. Transition the .csv files from S3 Standard to S3 Standard-Infrequent Access (S3 Standard-IA) 1 day after they are uploaded. Keep the image files in Reduced Redundancy Storage (RRS).
A company wants to move from many standalone AWS accounts to a consolidated, multi-account architecture. The company plans to create many new AWS accounts for different business units. The company needs to authenticate access to these AWS accounts by using a centralized corporate directory service. Which combination of actions should a solutions architect recommend to meet these requirements? (Select TWO.)
A. Create a new organization in AWS Organizations with all features turned on. Create the new AWS accounts in the organization.
B. Set up an Amazon Cognito identity pool. Configure AWS IAM Identity Center (AWS Single Sign-On) to accept Amazon Cognito authentication.
C. Configure a service control policy (SCP) to manage the AWS accounts. Add AWS IAM Identity Center (AWS Single Sign-On) to AWS Directory Service.
D. Create a new organization in AWS Organizations. Configure the organization's authentication mechanism to use AWS Directory Service directly.
E. Set up AWS IAM Identity Center (AWS Single Sign-On) in the organization. Configure IAM Identity Center, and integrate it with the company's corporate directory service.
A company is deploying a new application on Amazon EC2 instances. The application writes data to Amazon Elastic Block Store (Amazon EBS) volumes. The company needs to ensure that all data that is written to the EBS volumes is encrypted at rest. Which solution will meet this requirement?
A. Create an IAM role that specifies EBS encryption. Attach the role to the EC2 instances.
B. Create the EBS volumes as encrypted volumes. Attach the EBS volumes to the EC2 instances.
C. Create an EC2 instance tag that has a key of Encrypt and a value of True. Tag all instances that require encryption at the EBS level.
D. Create an AWS Key Management Service (AWS KMS) key policy that enforces EBS encryption in the account. Ensure that the key policy is active.
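To illustrate option B, EBS volumes are encrypted at creation time; the Region can also be set to encrypt new volumes by default. A boto3 sketch with hypothetical IDs:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Optionally force every new EBS volume in this Region to be encrypted.
ec2.enable_ebs_encryption_by_default()

# Create an encrypted volume and attach it to the application instance.
volume = ec2.create_volume(
    AvailabilityZone="us-east-1a",
    Size=100,
    VolumeType="gp3",
    Encrypted=True,  # uses the default AWS managed key unless KmsKeyId is supplied
)
ec2.get_waiter("volume_available").wait(VolumeIds=[volume["VolumeId"]])
ec2.attach_volume(
    VolumeId=volume["VolumeId"],
    InstanceId="i-0123456789abcdef0",
    Device="/dev/sdf",
)
```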
A serverless application uses Amazon API Gateway, AWS Lambda, and Amazon DynamoDB. The Lambda function needs permissions to read and write to the DynamoDB table. Which solution will give the Lambda function access to the DynamoDB table MOST securely?
A. Create an IAM user with programmatic access to the Lambda function. Attach a policy to the user that allows read and write access to the DynamoDB table. Store the access_key_id and secret_access_key parameters as part of the Lambda environment variables. Ensure that other AWS users do not have read and write access to the Lambda function configuration.
B. Create an IAM role that includes Lambda as a trusted service. Attach a policy to the role that allows read and write access to the DynamoDB table. Update the configuration of the Lambda function to use the new role as the execution role.
C. Create an IAM user with programmatic access to the Lambda function. Attach a policy to the user that allows read and write access to the DynamoDB table. Store the access_key_id and secret_access_key parameters in AWS Systems Manager Parameter Store as secure string parameters. Update the Lambda function code to retrieve the secure string parameters before connecting to the DynamoDB table.
D. Create an IAM role that includes DynamoDB as a trusted service. Attach a policy to the role that allows read and write access from the Lambda function. Update the code of the Lambda function to attach to the new role as an execution role.
A company wants to use artificial intelligence (AI) to determine the quality of its customer service calls. The company currently manages calls in four different languages, including English. The company will offer new languages in the future. The company does not have the resources to regularly maintain machine learning (ML) models. The company needs to create written sentiment analysis reports from the customer service call recordings. The customer service call recording text must be translated into English. Which combination of steps will meet these requirements? (Select THREE.)
A. Use Amazon Comprehend to translate the audio recordings into English.
B. Use Amazon Lex to create the written sentiment analysis reports.
C. Use Amazon Polly to convert the audio recordings into text.
D. Use Amazon Transcribe to convert the audio recordings in any language into text.
E. Use Amazon Translate to translate text in any language to English.
F. Use Amazon Comprehend to create the sentiment analysis reports.
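Options D, E, and F together describe a transcribe-translate-analyze pipeline. The sketch below strings the three services together with boto3; the bucket names, job name, and placeholder transcript text are hypothetical, and the Transcribe job itself runs asynchronously:

```python
import boto3

transcribe = boto3.client("transcribe")
translate = boto3.client("translate")
comprehend = boto3.client("comprehend")

# 1. Convert a call recording to text; Transcribe can identify the spoken language.
transcribe.start_transcription_job(
    TranscriptionJobName="call-1234",
    Media={"MediaFileUri": "s3://call-recordings-bucket/call-1234.wav"},
    IdentifyLanguage=True,
    OutputBucketName="call-transcripts-bucket",
)

# 2. Translate the transcript text into English (source language auto-detected).
transcript_text = "..."  # placeholder: read from the transcript JSON once the job completes
english = translate.translate_text(
    Text=transcript_text,
    SourceLanguageCode="auto",
    TargetLanguageCode="en",
)

# 3. Run sentiment analysis on the English text for the written report.
sentiment = comprehend.detect_sentiment(
    Text=english["TranslatedText"],
    LanguageCode="en",
)
print(sentiment["Sentiment"], sentiment["SentimentScore"])
```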
A company needs to configure a real-time data ingestion architecture for its application. The company needs an API, a process that transforms data as the data is streamed, and a storage solution for the data. Which solution will meet these requirements with the LEAST operational overhead?
A. Deploy an Amazon EC2 instance to host an API that sends data to an Amazon Kinesis data stream. Create an Amazon Kinesis Data Firehose delivery stream that uses the Kinesis data stream as a data source. Use AWS Lambda functions to transform the data. Use the Kinesis Data Firehose delivery stream to send the data to Amazon S3.
B. Deploy an Amazon EC2 instance to host an API that sends data to AWS Glue. Stop source/destination checking on the EC2 instance. Use AWS Glue to transform the data and to send the data to Amazon S3.
C. Configure an Amazon API Gateway API to send data to an Amazon Kinesis data stream. Create an Amazon Kinesis Data Firehose delivery stream that uses the Kinesis data stream as a data source. Use AWS Lambda functions to transform the data. Use the Kinesis Data Firehose delivery stream to send the data to Amazon S3.
D. Configure an Amazon API Gateway API to send data to AWS Glue. Use AWS Lambda functions to transform the data. Use AWS Glue to send the data to Amazon S3.
A 4-year-old media company is using the AWS Organizations all features feature set to organize its AWS accounts. According to the company's finance team, the billing information on the member accounts must not be accessible to anyone, including the root user of the member accounts. Which solution will meet these requirements?
A. Add all finance team users to an IAM group. Attach an AWS managed policy named Billing to the group.
B. Attach an identity-based policy to deny access to the billing information to all users, including the root user.
C. Create a service control policy (SCP) to deny access to the billing information. Attach the SCP to the root organizational unit (OU).
D. Convert from the Organizations all features feature set to the Organizations consolidated billing feature set.
A company uses on-premises servers to host its applications. The company is running out of storage capacity. The applications use both block storage and NFS storage. The company needs a high-performing solution that supports local caching without rearchitecting its existing applications. Which combination of actions should a solutions architect take to meet these requirements? (Select TWO.)
A. Mount Amazon S3 as a file system to the on-premises servers.
B. Deploy an AWS Storage Gateway file gateway to replace NFS storage.
C. Deploy AWS Snowball Edge to provision NFS mounts to on-premises servers.
D. Deploy an AWS Storage Gateway volume gateway to replace the block storage.
E. Deploy Amazon Elastic File System (Amazon EFS) volumes and mount them to on-premises servers.
A company operates a two-tier application for image processing. The application uses two Availability Zones, each with one public subnet and one private subnet. An Application Load Balancer (ALB) for the web tier uses the public subnets. Amazon EC2 instances for the application tier use the private subnets. Users report that the application is running more slowly than expected. A security audit of the web server log files shows that the application is receiving millions of illegitimate requests from a small number of IP addresses. A solutions architect needs to resolve the immediate performance problem while the company investigates a more permanent solution. What should the solutions architect recommend to meet this requirement?
A. Modify the inbound security group for the web tier. Add a deny rule for the IP addresses that are consuming resources.
B. Modify the network ACL for the web tier subnets. Add an inbound deny rule for the IP addresses that are consuming resources.
C. Modify the inbound security group for the application tier. Add a deny rule for the IP addresses that are consuming resources.
D. Modify the network ACL for the application tier subnets. Add an inbound deny rule for the IP addresses that are consuming resources.
A gaming company uses Amazon DynamoDB to store user information such as geographic location, player data, and leaderboards. The company needs to configure continuous backups to an Amazon S3 bucket with a minimal amount of coding. The backups must not affect availability of the application and must not affect the read capacity units (RCUs) that are defined for the table. Which solution meets these requirements?
A. Use an Amazon EMR cluster. Create an Apache Hive job to back up the data to Amazon S3.
B. Export the data directly from DynamoDB to Amazon S3 with continuous backups. Turn on point-in-time recovery for the table.
C. Configure Amazon DynamoDB Streams. Create an AWS Lambda function to consume the stream and export the data to an Amazon S3 bucket.
D. Create an AWS Lambda function to export the data from the database tables to Amazon S3 on a regular basis. Turn on point-in-time recovery for the table.
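Option B pairs point-in-time recovery (continuous backups) with DynamoDB's native export to S3, which reads from the backup rather than from the table's RCUs. A boto3 sketch with hypothetical names and an assumed table ARN:

```python
import boto3

dynamodb = boto3.client("dynamodb")

# Turn on point-in-time recovery (continuous backups) for the table.
dynamodb.update_continuous_backups(
    TableName="player-data",
    PointInTimeRecoverySpecification={"PointInTimeRecoveryEnabled": True},
)

# Export the table to S3; the export is served from the continuous backup,
# so it does not consume the table's read capacity units.
dynamodb.export_table_to_point_in_time(
    TableArn="arn:aws:dynamodb:us-east-1:111122223333:table/player-data",
    S3Bucket="dynamodb-exports-bucket",
    ExportFormat="DYNAMODB_JSON",
)
```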
A company has an on-premises server that uses an Oracle database to process and store customer information. The company wants to use an AWS database service to achieve higher availability and to improve application performance. The company also wants to offload reporting from its primary database system. Which solution will meet these requirements in the MOST operationally efficient way?
A. Use AWS Database Migration Service (AWS DMS) to create an Amazon RDS DB instance in multiple AWS Regions. Point the reporting functions toward a separate DB instance from the primary DB instance.
B. Use Amazon RDS in a Single-AZ deployment to create an Oracle database. Create a read replica in the same zone as the primary DB instance. Direct the reporting functions to the read replica.
C. Use Amazon RDS deployed in a Multi-AZ cluster deployment to create an Oracle database. Direct the reporting functions to use the reader instance in the cluster deployment.
D. Use Amazon RDS deployed in a Multi-AZ instance deployment to create an Amazon Aurora database. Direct the reporting functions to the reader instances.
A company has a small Python application that processes JSON documents and outputs the results to an on-premises SQL database. The application runs thousands of times each day. The company wants to move the application to the AWS Cloud. The company needs a highly available solution that maximizes scalability and minimizes operational overhead. Which solution will meet these requirements?
A. Place the JSON documents in an Amazon S3 bucket. Run the Python code on multiple Amazon EC2 instances to process the documents. Store the results in an Amazon Aurora DB cluster.
B. Place the JSON documents in an Amazon S3 bucket. Create an AWS Lambda function that runs the Python code to process the documents as they arrive in the S3 bucket. Store the results in an Amazon Aurora DB cluster.
C. Place the JSON documents in an Amazon Elastic Block Store (Amazon EBS) volume. Use the EBS Multi-Attach feature to attach the volume to multiple Amazon EC2 instances. Run the Python code on the EC2 instances to process the documents. Store the results on an Amazon RDS DB instance.
D. Place the JSON documents in an Amazon Simple Queue Service (Amazon SQS) queue as messages. Deploy the Python code as a container on an Amazon Elastic Container Service (Amazon ECS) cluster that is configured with the Amazon EC2 launch type. Use the container to process the SQS messages. Store the results on an Amazon RDS DB instance.
A company wants to build a web application on AWS. Client access requests to the website are not predictable and can be idle for a long time. Only customers who have paid a subscription fee can have the ability to sign in and use the web application. Which combination of steps will meet these requirements MOST cost-effectively? (Select THREE.)
A. Create an AWS Lambda function to retrieve user information from Amazon DynamoDB. Create an Amazon API Gateway endpoint to accept RESTful APIs. Send the API calls to the Lambda function.
B. Create an Amazon Elastic Container Service (Amazon ECS) service behind an Application Load Balancer to retrieve user information from Amazon RDS. Create an Amazon API Gateway endpoint to accept RESTful APIs. Send the API calls to the Lambda function.
C. Create an Amazon Cognito user pool to authenticate users.
D. Create an Amazon Cognito identity pool to authenticate users.
E. Use AWS Amplify to serve the frontend web content with HTML, CSS, and JS. Use an integrated Amazon CloudFront configuration.
F. Use Amazon S3 static web hosting with PHP, CSS, and JS. Use Amazon CloudFront to serve the frontend web content.
A company runs a three-tier web application in the AWS Cloud that operates across three Availability Zones. The application architecture has an Application Load Balancer, an Amazon EC2 web server that hosts user session states, and a MySQL database that runs on an EC2 instance. The company expects sudden increases in application traffic. The company wants to be able to scale to meet future application capacity demands and to ensure high availability across all three Availability Zones. Which solution will meet these requirements?
A. Migrate the MySQL database to Amazon RDS for MySQL with a Multi-AZ DB cluster deployment. Use Amazon ElastiCache for Redis with high availability to store session data and to cache reads. Migrate the web server to an Auto Scaling group that is in three Availability Zones.
B. Migrate the MySQL database to Amazon RDS for MySQL with a Multi-AZ DB cluster deployment. Use Amazon ElastiCache for Memcached with high availability to store session data and to cache reads. Migrate the web server to an Auto Scaling group that is in three Availability Zones.
C. Migrate the MySQL database to Amazon DynamoDB. Use DynamoDB Accelerator (DAX) to cache reads. Store the session data in DynamoDB. Migrate the web server to an Auto Scaling group that is in three Availability Zones.
D. Migrate the MySQL database to Amazon RDS for MySQL in a single Availability Zone. Use Amazon ElastiCache for Redis with high availability to store session data and to cache reads. Migrate the web server to an Auto Scaling group that is in three Availability Zones.
A company runs a website that uses a content management system (CMS) on Amazon EC2. The CMS runs on a single EC2 instance and uses an Amazon Aurora MySQL MultiAZ DB instance for the data tier. Website images are stored on an Amazon Elastic Block Store (Amazon EBS) volume that is mounted inside the EC2 instance. Which combination of actions should a solutions architect take to improve the performance and resilience of the website? (Select TWO.)
A. Move the website images into an Amazon S3 bucket that is mounted on every EC2 instance.
B. Share the website images by using an NFS share from the primary EC2 instance. Mount this share on the other EC2 instances.
C. Move the website images onto an Amazon Elastic File System (Amazon EFS) file system that is mounted on every EC2 instance.
D. Create an Amazon Machine Image (AMI) from the existing EC2 instance. Use the AMI to provision new instances behind an Application Load Balancer as part of an Auto Scaling group. Configure the Auto Scaling group to maintain a minimum of two instances. Configure an accelerator in AWS Global Accelerator for the website.
E. Create an Amazon Machine Image (AMI) from the existing EC2 instance. Use the AMI to provision new instances behind an Application Load Balancer as part of an Auto Scaling group. Configure the Auto Scaling group to maintain a minimum of two instances. Configure an Amazon CloudFront distribution for the website.
A company moved its on-premises PostgreSQL database to an Amazon RDS for PostgreSQL DB instance. The company successfully launched a new product. The workload on the database has increased. The company wants to accommodate the larger workload without adding infrastructure. Which solution will meet these requirements MOST cost-effectively?
A. Buy reserved DB instances for the total workload. Make the Amazon RDS for PostgreSQL DB instance larger.
B. Make the Amazon RDS for PostgreSQL DB instance a Multi-AZ DB instance.
C. Buy reserved DB instances for the total workload. Add another Amazon RDS for PostgreSQL DB instance.
D. Make the Amazon RDS for PostgreSQL DB instance an on-demand DB instance.
A company's data platform uses an Amazon Aurora MySQL database. The database has multiple read replicas and multiple DB instances across different Availability Zones. Users have recently reported errors from the database that indicate that there are too many connections. The company wants to reduce the failover time by 20% when a read replica is promoted to primary writer. Which solution will meet this requirement?
A. Switch from Aurora to Amazon RDS with Multi-AZ cluster deployment.
B. Use Amazon RDS Proxy in front of the Aurora database.
C. Switch to Amazon DynamoDB with DynamoDB Accelerator (DAX) for read connections.
D. Switch to Amazon Redshift with relocation capability.
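As a sketch of option B, RDS Proxy can be placed in front of the Aurora cluster with a couple of boto3 calls; the ARNs, subnet IDs, and identifiers below are placeholders:

```python
import boto3

rds = boto3.client("rds")

# Create the proxy that will pool and reuse database connections
rds.create_db_proxy(
    DBProxyName="aurora-app-proxy",
    EngineFamily="MYSQL",
    Auth=[{
        "AuthScheme": "SECRETS",
        "SecretArn": "arn:aws:secretsmanager:us-east-1:123456789012:secret:db-creds",
        "IAMAuth": "DISABLED",
    }],
    RoleArn="arn:aws:iam::123456789012:role/rds-proxy-secrets-role",
    VpcSubnetIds=["subnet-aaa111", "subnet-bbb222"],
    RequireTLS=True,
)

# Point the proxy at the Aurora cluster; applications then connect to the proxy
# endpoint, which absorbs connection storms and reconnects faster after failover.
rds.register_db_proxy_targets(
    DBProxyName="aurora-app-proxy",
    DBClusterIdentifiers=["my-aurora-cluster"],
)
```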
A company uses Amazon EC2 instances to host its internal systems. As part of a deployment operation, an administrator tries to use the AWS CLI to terminate an EC2 instance. However, the administrator receives a 403 (Access Denied) error message. The administrator is using an IAM role that has the following IAM policy attached: What is the cause of the unsuccessful request?
A. The EC2 instance has a resource-based policy with a Deny statement.
B. The principal has not been specified in the policy statement.
C. The "Action" field does not grant the actions that are required to terminate the EC2 instance.
D. The request to terminate the EC2 instance does not originate from the CIDR blocks 192.0.2.0/24 or 203.0.113.0/24.
A company uses Amazon API Gateway to run a private gateway with two REST APIs in the same VPC. The BuyStock RESTful web service calls the CheckFunds RESTful web service to ensure that enough funds are available before a stock can be purchased. The company has noticed in the VPC flow logs that the BuyStock RESTful web service calls the CheckFunds RESTful web service over the internet instead of through the VPC. A solutions architect must implement a solution so that the APIs communicate through the VPC. Which solution will meet these requirements with the FEWEST changes to the code?
A. Add an X-API-Key header in the HTTP header for authorization.
B. Use an interface endpoint.
C. Use a gateway endpoint.
D. Add an Amazon Simple Queue Service (Amazon SQS) queue between the two REST APIs.
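Option B boils down to one interface VPC endpoint for the execute-api service; a hedged boto3 sketch with placeholder IDs:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Interface endpoint for API Gateway (execute-api) so the BuyStock API can call
# the CheckFunds API over private IPs instead of the internet.
ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",
    ServiceName="com.amazonaws.us-east-1.execute-api",
    SubnetIds=["subnet-aaa111", "subnet-bbb222"],
    SecurityGroupIds=["sg-0123456789abcdef0"],
    PrivateDnsEnabled=True,  # keeps the existing API hostnames working, so no code changes
)
```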
A company has multiple Windows file servers on premises. The company wants to migrate and consolidate its files into an Amazon FSx for Windows File Server file system. File permissions must be preserved to ensure that access rights do not change. Which solutions will meet these requirements? (Select TWO.)
A. Deploy AWS DataSync agents on premises. Schedule DataSync tasks to transfer the data to the FSx for Windows File Server file system.
B. Copy the shares on each file server into Amazon S3 buckets by using the AWS CLI. Schedule AWS DataSync tasks to transfer the data to the FSx for Windows File Server file system.
C. Remove the drives from each file server. Ship the drives to AWS for import into Amazon S3. Schedule AWS DataSync tasks to transfer the data to the FSx for Windows File Server file system.
D. Order an AWS Snowcone device. Connect the device to the on-premises network. Launch AWS DataSync agents on the device. Schedule DataSync tasks to transfer the data to the FSx for Windows File Server file system.
E. Order an AWS Snowball Edge Storage Optimized device. Connect the device to the on-premises network. Copy data to the device by using the AWS CLI. Ship the device back to AWS for import into Amazon S3. Schedule AWS DataSync tasks to transfer the data to the FSx for Windows File Server file system.
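For options A and D, the DataSync pieces look roughly like the boto3 sketch below; hostnames, ARNs, and credentials are placeholders, and an SMB-to-FSx transfer carries the NTFS ACLs across, which is what preserves file permissions:

```python
import boto3

datasync = boto3.client("datasync")

# Source: the on-premises Windows file share, reached through a DataSync agent
source = datasync.create_location_smb(
    ServerHostname="fileserver01.corp.example.com",
    Subdirectory="/share",
    User="migration-user",
    Password="********",
    Domain="CORP",
    AgentArns=["arn:aws:datasync:us-west-2:123456789012:agent/agent-0abc"],
)

# Destination: the FSx for Windows File Server file system
destination = datasync.create_location_fsx_windows(
    FsxFilesystemArn="arn:aws:fsx:us-west-2:123456789012:file-system/fs-0abc",
    SecurityGroupArns=["arn:aws:ec2:us-west-2:123456789012:security-group/sg-0abc"],
    User="migration-user",
    Password="********",
    Domain="CORP",
)

# The task performs the scheduled transfer between the two locations
datasync.create_task(
    SourceLocationArn=source["LocationArn"],
    DestinationLocationArn=destination["LocationArn"],
    Name="fileserver01-to-fsx",
)
```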
A company is running a microservices application on Amazon EC2 instances. The company wants to migrate the application to an Amazon Elastic Kubernetes Service (Amazon EKS) cluster for scalability. The company must configure the Amazon EKS control plane with endpoint private access set to true and endpoint public access set to false to maintain security compliance. The company must also put the data plane in private subnets. However, the company has received error notifications because the node cannot join the cluster. Which solution will allow the node to join the cluster?
A. Grant the required permission in AWS Identity and Access Management (IAM) to the AmazonEKSNodeRole IAM role.
B. Create interface VPC endpoints to allow nodes to access the control plane.
C. Recreate the nodes in the public subnet. Restrict security groups for the EC2 nodes.
D. Allow outbound traffic in the security group of the nodes.
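For a control plane with public access disabled, nodes in private subnets need VPC endpoints to reach the AWS APIs they depend on. A hedged sketch of the commonly required endpoints (the Region, IDs, and exact service list are assumptions to verify against the EKS private-cluster documentation):

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")

VPC_ID = "vpc-0123456789abcdef0"          # placeholder
SUBNET_IDS = ["subnet-aaa111", "subnet-bbb222"]
SG_IDS = ["sg-0123456789abcdef0"]

# Interface endpoints typically needed so private nodes can pull images and
# bootstrap against a private-only control plane
for service in ("ecr.api", "ecr.dkr", "ec2", "sts"):
    ec2.create_vpc_endpoint(
        VpcEndpointType="Interface",
        VpcId=VPC_ID,
        ServiceName=f"com.amazonaws.us-west-2.{service}",
        SubnetIds=SUBNET_IDS,
        SecurityGroupIds=SG_IDS,
        PrivateDnsEnabled=True,
    )

# ECR stores image layers in S3, so a gateway endpoint for S3 is needed as well
ec2.create_vpc_endpoint(
    VpcEndpointType="Gateway",
    VpcId=VPC_ID,
    ServiceName="com.amazonaws.us-west-2.s3",
    RouteTableIds=["rtb-0123456789abcdef0"],
)
```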
A company wants to create an application to store employee data in a hierarchical structured relationship. The company needs a minimum-latency response to high-traffic queries for the employee data and must protect any sensitive data. The company also needs to receive monthly email messages if any financial information is present in the employee data. Which combination of steps should a solutions architect take to meet these requirements? (Select TWO.)
A. Use Amazon Redshift to store the employee data in hierarchies. Unload the data to Amazon S3 every month.
B. Use Amazon DynamoDB to store the employee data in hierarchies. Export the data to Amazon S3 every month.
C. Configure Amazon Macie for the AWS account. Integrate Macie with Amazon EventBridge to send monthly events to AWS Lambda.
D. Use Amazon Athena to analyze the employee data in Amazon S3. Integrate Athena with Amazon QuickSight to publish analysis dashboards and share the dashboards with users.
E. Configure Amazon Macie for the AWS account. Integrate Macie with Amazon EventBridge to send monthly notifications through an Amazon Simple Notification Service (Amazon SNS) subscription.
A company wants to use high-performance computing and artificial intelligence to improve its fraud prevention and detection technology. The company requires distributed processing to complete a single workload as quickly as possible. Which solution will meet these requirements?
A. Use Amazon Elastic Kubernetes Service (Amazon EKS) and multiple containers.
B. Use AWS ParallelCluster and the Message Passing Interface (MPI) libraries.
C. Use an Application Load Balancer and Amazon EC2 instances.
D. Use AWS Lambda functions.
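Option B pairs ParallelCluster with MPI. A toy mpi4py example of the distributed-processing pattern (the workload itself is made up; run it with something like `mpirun -n 8 python job.py` on the cluster):

```python
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# Each rank processes its own slice of the problem in parallel...
partial = sum(range(rank, 1_000_000, size))

# ...and a collective reduce combines the results across all nodes
total = comm.reduce(partial, op=MPI.SUM, root=0)
if rank == 0:
    print(f"total computed across {size} ranks: {total}")
```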
A company runs container applications by using Amazon Elastic Kubernetes Service (Amazon EKS) and the Kubernetes Horizontal Pod Autoscaler. The workload is not consistent throughout the day. A solutions architect notices that the number of nodes does not automatically scale out when the existing nodes have reached maximum capacity in the cluster, which causes performance issues. Which solution will resolve this issue with the LEAST administrative overhead?
A. Scale out the nodes by tracking the memory usage.
B. Use the Kubernetes Cluster Autoscaler to manage the number of nodes in the cluster.
C. Use an AWS Lambda function to resize the EKS cluster automatically.
D. Use an Amazon EC2 Auto Scaling group to distribute the workload.
A global marketing company has applications that run in the ap-southeast-2 Region and the eu-west-1 Region. Applications that run in a VPC in eu-west-1 need to communicate securely with databases that run in a VPC in ap-southeast-2. Which network design will meet these requirements?
A. Create a VPC peering connection between the eu-west-1 VPC and the ap-southeast-2 VPC. Create an inbound rule in the eu-west-1 application security group that allows traffic from the database server IP addresses in the ap-southeast-2 security group.
B. Configure a VPC peering connection between the ap-southeast-2 VPC and the eu-west-1 VPC. Update the subnet route tables. Create an inbound rule in the ap-southeast-2 database security group that references the security group ID of the application servers in eu-west-1.
C. Configure a VPC peering connection between the ap-southeast-2 VPC and the eu-west-1 VPC. Update the subnet route tables. Create an inbound rule in the ap-southeast-2 database security group that allows traffic from the eu-west-1 application server IP addresses.
D. Create a transit gateway with a peering attachment between the eu-west-1 VPC and the ap-southeast-2 VPC. After the transit gateways are properly peered and routing is configured, create an inbound rule in the database security group that references the security group ID of the application servers in eu-west-1.
A company migrated a MySQL database from the company's on-premises data center to an Amazon RDS for MySQL DB instance. The company sized the RDS DB instance to meet the company's average daily workload. Once a month, the database performs slowly when the company runs queries for a report. The company wants to have the ability to run reports and maintain the performance of the daily workloads. Which solution will meet these requirements?
A. Create a read replica of the database. Direct the queries to the read replica.
B. Create a backup of the database. Restore the backup to another DB instance. Direct the queries to the new database.
C. Export the data to Amazon S3. Use Amazon Athena to query the S3 bucket.
D. Resize the DB instance to accommodate the additional workload.
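Option A is a single API call plus a connection-string change for the reporting job; a hedged boto3 sketch with placeholder identifiers:

```python
import boto3

rds = boto3.client("rds")

# Create a read replica and point the monthly reporting queries at its endpoint,
# keeping the report load off the primary instance.
rds.create_db_instance_read_replica(
    DBInstanceIdentifier="appdb-reporting-replica",
    SourceDBInstanceIdentifier="appdb-prod",
    DBInstanceClass="db.r6g.large",
)
```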
A company has migrated multiple Microsoft Windows Server workloads to Amazon EC2 instances that run in the us-west-1 Region. The company manually backs up the workloads to create an image as needed. In the event of a natural disaster in the us-west-1 Region, the company wants to recover workloads quickly in the us-west-2 Region. The company wants no more than 24 hours of data loss on the EC2 instances. The company also wants to automate any backups of the EC2 instances. Which solutions will meet these requirements with the LEAST administrative effort? (Select TWO.)
A. Create an Amazon EC2-backed Amazon Machine Image (AMI) lifecycle policy to create a backup based on tags. Schedule the backup to run twice daily. Copy the image on demand.
B. Create an Amazon EC2-backed Amazon Machine Image (AMI) lifecycle policy to create a backup based on tags. Schedule the backup to run twice daily. Configure the copy to the us-west-2 Region.
C. Create backup vaults in us-west-1 and in us-west-2 by using AWS Backup. Create a backup plan for the EC2 instances based on tag values. Create an AWS Lambda function to run as a scheduled job to copy the backup data to us-west-2.
D. Create a backup vault by using AWS Backup. Use AWS Backup to create a backup plan for the EC2 instances based on tag values. Define the destination for the copy as us-west-2. Specify the backup schedule to run twice daily.
E. Create a backup vault by using AWS Backup. Use AWS Backup to create a backup plan for the EC2 instances based on tag values. Specify the backup schedule to run twice daily. Copy on demand to us-west-2.
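A sketch of option D with boto3: one backup plan that runs twice daily and copies every recovery point to us-west-2, selecting instances by tag (ARNs, vault names, and the tag key are placeholders):

```python
import boto3

backup = boto3.client("backup")

# Backup plan: every 12 hours, with a cross-Region copy to the DR vault
plan = backup.create_backup_plan(
    BackupPlan={
        "BackupPlanName": "ec2-dr-plan",
        "Rules": [{
            "RuleName": "twice-daily-cross-region",
            "TargetBackupVaultName": "primary-vault",       # vault in us-west-1
            "ScheduleExpression": "cron(0 0/12 * * ? *)",
            "CopyActions": [{
                "DestinationBackupVaultArn": "arn:aws:backup:us-west-2:123456789012:backup-vault:dr-vault"
            }],
        }],
    }
)

# Protect the EC2 instances that carry a matching tag
backup.create_backup_selection(
    BackupPlanId=plan["BackupPlanId"],
    BackupSelection={
        "SelectionName": "tagged-ec2",
        "IamRoleArn": "arn:aws:iam::123456789012:role/service-role/AWSBackupDefaultServiceRole",
        "ListOfTags": [
            {"ConditionType": "STRINGEQUALS", "ConditionKey": "backup", "ConditionValue": "true"}
        ],
    },
)
```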
A company runs a container application by using Amazon Elastic Kubernetes Service (Amazon EKS). The application includes microservices that manage customers and place orders. The company needs to route incoming requests to the appropriate microservices. Which solution will meet this requirement MOST cost-effectively?
A. Use the AWS Load Balancer Controller to provision a Network Load Balancer.
B. Use the AWS Load Balancer Controller to provision an Application Load Balancer.
C. Use an AWS Lambda function to connect the requests to Amazon EKS.
D. Use Amazon API Gateway to connect the requests to Amazon EKS.
An application uses an Amazon RDS MySQL DB instance. The RDS database is becoming low on disk space. A solutions architect wants to increase the disk space without downtime. Which solution meets these requirements with the LEAST amount of effort?
A. Enable storage autoscaling in RDS.
B. Increase the RDS database instance size.
C. Change the RDS database instance storage type to Provisioned IOPS.
D. Back up the RDS database, increase the storage capacity, restore the database, and stop the previous instance.
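Option A comes down to setting a storage ceiling on the existing instance; a minimal boto3 sketch with a placeholder identifier:

```python
import boto3

rds = boto3.client("rds")

# Setting MaxAllocatedStorage above the current allocation turns on RDS storage
# autoscaling, so the volume grows automatically with no downtime.
rds.modify_db_instance(
    DBInstanceIdentifier="app-mysql",
    MaxAllocatedStorage=1000,   # upper bound in GiB for automatic growth
    ApplyImmediately=True,
)
```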
A social media company wants to allow its users to upload images in an application that is hosted in the AWS Cloud. The company needs a solution that automatically resizes the images so that the images can be displayed on multiple device types. The application experiences unpredictable traffic patterns throughout the day. The company is seeking a highly available solution that maximizes scalability. What should a solutions architect do to meet these requirements?
A. Create a static website hosted in Amazon S3 that invokes AWS Lambda functions to
resize the images and store the images in an Amazon S3 bucket.
B. Create a static website hosted in Amazon CloudFront that invokes AWS Step Functions to resize the images and store the images in an Amazon RDS database.
C. Create a dynamic website hosted on a web server that runs on an Amazon EC2 instance. Configure a process that runs on the EC2 instance to resize the images and store the images in an Amazon S3 bucket.
D. Create a dynamic website hosted on an automatically scaling Amazon Elastic Container Service (Amazon ECS) cluster that creates a resize job in Amazon Simple Queue Service (Amazon SQS). Set up an image-resizing program that runs on an Amazon EC2 instance to process the resize jobs.
A retail company uses a regional Amazon API Gateway API for its public REST APIs. The API Gateway endpoint is a custom domain name that points to an Amazon Route 53 alias record. A solutions architect needs to create a solution that has minimal effects on customers and minimal data loss to release the new version of APIs. Which solution will meet these requirements?
A. Create a canary release deployment stage for API Gateway. Deploy the latest API version. Point an appropriate percentage of traffic to the canary stage. After API verification, promote the canary stage to the production stage.
B. Create a new API Gateway endpoint with a new version of the API in OpenAPI YAML file format. Use the import-to-update operation in merge mode into the API in API Gateway. Deploy the new version of the API to the production stage.
C. Create a new API Gateway endpoint with a new version of the API in OpenAPI JSON file format. Use the import-to-update operation in overwrite mode into the API in API Gateway. Deploy the new version of the API to the production stage.
D. Create a new API Gateway endpoint with new versions of the API definitions. Create a custom domain name for the new API Gateway API. Point the Route 53 alias record to the new API Gateway API custom domain name.
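For option A, a canary deployment on an existing production stage can be created with one boto3 call; the API ID and traffic percentage are placeholders:

```python
import boto3

apigw = boto3.client("apigateway")

# Deploy the new API version as a canary on the existing prod stage, routing
# 10% of requests to it while the rest continue to hit the current version.
apigw.create_deployment(
    restApiId="a1b2c3d4e5",
    stageName="prod",
    description="v2 canary",
    canarySettings={
        "percentTraffic": 10.0,
        "useStageCache": False,
    },
)
# After verification, the canary is promoted to the stage (and the canary
# settings removed), so the cutover happens without touching the custom domain.
```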
A company is expecting rapid growth in the near future. A solutions architect needs to configure existing users and grant permissions to new users on AWS. The solutions architect has decided to create IAM groups. The solutions architect will add the new users to IAM groups based on department. Which additional action is the MOST secure way to grant permissions to the new users?
A. Apply service control policies (SCPs) to manage access permissions.
B. Create IAM roles that have least privilege permission. Attach the roles to the IAM groups.
C. Create an IAM policy that grants least privilege permission. Attach the policy to the IAM groups.
D. Create IAM roles. Associate the roles with a permissions boundary that defines the maximum permissions.
A medical research lab produces data that is related to a new study. The lab wants to make the data available with minimum latency to clinics across the country for their on-premises, file-based applications. The data files are stored in an Amazon S3 bucket that has read-only permissions for each clinic. What should a solutions architect recommend to meet these requirements?
A. Deploy an AWS Storage Gateway file gateway as a virtual machine (VM) on premises at each clinic.
B. Migrate the files to each clinic’s on-premises applications by using AWS DataSync for processing.
C. Deploy an AWS Storage Gateway volume gateway as a virtual machine (VM) on premises at each clinic.
D. Attach an Amazon Elastic File System (Amazon EFS) file system to each clinic’s on-premises servers.
A company has hired a solutions architect to design a reliable architecture for its application. The application consists of one Amazon RDS DB instance and two manually provisioned Amazon EC2 instances that run web servers. The EC2 instances are located in a single Availability Zone. An employee recently deleted the DB instance, and the application was unavailable for 24 hours as a result. The company is concerned with the overall reliability of its environment. What should the solutions architect do to maximize reliability of the application's infrastructure?
A. Delete one EC2 instance and enable termination protection on the other EC2 instance. Update the DB instance to be Multi-AZ, and enable deletion protection.
B. Update the DB instance to be Multi-AZ, and enable deletion protection. Place the EC2 instances behind an Application Load Balancer, and run them in an EC2 Auto Scaling group across multiple Availability Zones.
C. Create an additional DB instance along with an Amazon API Gateway and an AWS Lambda function. Configure the application to invoke the Lambda function through API Gateway. Have the Lambda function write the data to the two DB instances.
D. Place the EC2 instances in an EC2 Auto Scaling group that has multiple subnets located in multiple Availability Zones. Use Spot Instances instead of On-Demand Instances. Set up Amazon CloudWatch alarms to monitor the health of the instances. Update the DB instance to be Multi-AZ, and enable deletion protection.
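The database half of option B is two flags on the existing instance; a minimal boto3 sketch with a placeholder identifier:

```python
import boto3

rds = boto3.client("rds")

# Convert the DB instance to Multi-AZ and enable deletion protection so an
# accidental delete request is rejected.
rds.modify_db_instance(
    DBInstanceIdentifier="app-db",
    MultiAZ=True,
    DeletionProtection=True,
    ApplyImmediately=True,
)
```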
A solutions architect is implementing a complex Java application with a MySQL database. The Java application must be deployed on Apache Tomcat and must be highly available. What should the solutions architect do to meet these requirements?
A. Deploy the application in AWS Lambda. Configure an Amazon API Gateway API to connect with the Lambda functions.
B. Deploy the application by using AWS Elastic Beanstalk. Configure a load-balanced environment and a rolling deployment policy.
C. Migrate the database to Amazon ElastiCache. Configure the ElastiCache security group to allow access from the application.
D. Launch an Amazon EC2 instance. Install a MySQL server on the EC2 instance. Configure the application on the server. Create an AMI. Use the AMI to create a launch template with an Auto Scaling group.
A company stores data in Amazon S3. According to regulations, the data must not contain personally identifiable information (PII). The company recently discovered that S3 buckets have some objects that contain PII. The company needs to automatically detect PII in S3 buckets and to notify the company's security team. Which solution will meet these requirements?
A. Use Amazon Macie. Create an Amazon EventBridge rule to filter the SensitiveData event type from Macie findings and to send an Amazon Simple Notification Service (Amazon SNS) notification to the security team.
B. Use Amazon GuardDuty. Create an Amazon EventBridge rule to filter the CRITICAL event type from GuardDuty findings and to send an Amazon Simple Notification Service (Amazon SNS) notification to the security team.
C. Use Amazon Macie. Create an Amazon EventBridge rule to filter the SensitiveData:S3Object/Personal event type from Macie findings and to send an Amazon Simple Queue Service (Amazon SQS) notification to the security team.
D. Use Amazon GuardDuty. Create an Amazon EventBridge rule to filter the CRITICAL event type from GuardDuty findings and to send an Amazon Simple Queue Service (Amazon SQS) notification to the security team.
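A sketch of the EventBridge wiring behind option A; the event-pattern fields follow the Macie finding format but should be verified, and the rule name and SNS topic ARN are placeholders:

```python
import json
import boto3

events = boto3.client("events")

# Rule that matches Macie sensitive-data findings
events.put_rule(
    Name="macie-pii-findings",
    EventPattern=json.dumps({
        "source": ["aws.macie"],
        "detail-type": ["Macie Finding"],
        "detail": {"type": [{"prefix": "SensitiveData"}]},
    }),
)

# Forward matched findings to the security team's SNS topic
events.put_targets(
    Rule="macie-pii-findings",
    Targets=[{"Id": "security-sns", "Arn": "arn:aws:sns:us-east-1:123456789012:security-alerts"}],
)
```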
A company hosts its application in the AWS Cloud. The application runs on Amazon EC2 instances behind an Elastic Load Balancer in an Auto Scaling group and uses an Amazon DynamoDB table. The company wants to ensure the application can be made available in another AWS Region with minimal downtime. What should a solutions architect do to meet these requirements with the LEAST amount of downtime?
A. Create an Auto Scaling group and a load balancer in the disaster recovery Region. Configure the DynamoDB table as a global table. Configure DNS failover to point to the new disaster recovery Region's load balancer.
B. Create an AWS CloudFormation template to create EC2 instances, load balancers, and DynamoDB tables to be launched when needed. Configure DNS failover to point to the new disaster recovery Region's load balancer.
C. Create an AWS CloudFormation template to create EC2 instances and a load balancer to be launched when needed. Configure the DynamoDB table as a global table. Configure DNS failover to point to the new disaster recovery Region's load balancer.
D. Create an Auto Scaling group and load balancer in the disaster recovery Region. Configure the DynamoDB table as a global table. Create an Amazon CloudWatch alarm to trigger an AWS Lambda function that updates Amazon Route 53 pointing to the disaster recovery load balancer.
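The global-table piece of options A and D can be a single call on the existing table; this assumes the current (2019.11.21) global tables version, and the table name is a placeholder:

```python
import boto3

dynamodb = boto3.client("dynamodb", region_name="us-east-1")

# Add a replica in the disaster recovery Region, turning the table into a
# global table that is continuously replicated.
dynamodb.update_table(
    TableName="app-table",
    ReplicaUpdates=[{"Create": {"RegionName": "us-west-2"}}],
)
```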
A company wants to move from many standalone AWS accounts to a consolidated, multi-account architecture. The company plans to create many new AWS accounts for different business units. The company needs to authenticate access to these AWS accounts by using a centralized corporate directory service. Which combination of actions should a solutions architect recommend to meet these requirements? (Select TWO.)
A. Create a new organization in AWS Organizations with all features turned on. Create the new AWS accounts in the organization.
B. Set up an Amazon Cognito identity pool. Configure AWS IAM Identity Center (AWS Single Sign-On) to accept Amazon Cognito authentication.
C. Configure a service control policy (SCP) to manage the AWS accounts. Add AWS IAM Identity Center (AWS Single Sign-On) to AWS Directory Service.
D. Create a new organization in AWS Organizations. Configure the organization's authentication mechanism to use AWS Directory Service directly.
E. Set up AWS IAM Identity Center (AWS Single Sign-On) in the organization. Configure IAM Identity Center, and integrate it with the company's corporate directory service.
A company has developed a new video game as a web application. The application is in a three-tier architecture in a VPC with Amazon RDS for MySQL in the database layer. Several players will compete concurrently online. The game's developers want to display a top-10 scoreboard in near-real time and offer the ability to stop and restore the game while preserving the current scores. What should a solutions architect do to meet these requirements?
A. Set up an Amazon ElastiCache for Memcached cluster to cache the scores for the web application to display.
B. Set up an Amazon ElastiCache for Redis cluster to compute and cache the scores for the web application to display.
C. Place an Amazon CloudFront distribution in front of the web application to cache the scoreboard in a section of the application.
D. Create a read replica on Amazon RDS for MySQL to run queries to compute the scoreboard and serve the read traffic to the web application.
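Option B maps naturally onto a Redis sorted set; a small redis-py sketch with a placeholder endpoint and key names:

```python
import redis  # redis-py client

# Hypothetical ElastiCache for Redis endpoint
r = redis.Redis(host="scores.xxxxxx.use1.cache.amazonaws.com", port=6379)

def record_score(player: str, score: float) -> None:
    """Write the player's latest score into a sorted set keyed by score."""
    r.zadd("leaderboard", {player: score})

def top_10():
    """Return the top-10 scoreboard in near-real time, highest score first."""
    return r.zrevrange("leaderboard", 0, 9, withscores=True)
```

Because ElastiCache for Redis supports snapshots and backup/restore, the scores can also survive stopping and restoring the game, which Memcached does not offer.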
A company needs to migrate a MySQL database from its on-premises data center to AWS within 2 weeks. The database is 20 TB in size. The company wants to complete the migration with minimal downtime. Which solution will migrate the database MOST cost-effectively?
A. Order an AWS Snowball Edge Storage Optimized device. Use AWS Database Migration Service (AWS DMS) with AWS Schema Conversion Tool (AWS SCT) to migrate the database with replication of ongoing changes. Send the Snowball Edge device to AWS to finish the migration and continue the ongoing replication.
B. Order an AWS Snowmobile vehicle. Use AWS Database Migration Service (AWS DMS) with AWS Schema Conversion Tool (AWS SCT) to migrate the database with ongoing changes. Send the Snowmobile vehicle back to AWS to finish the migration and continue the ongoing replication.
C. Order an AWS Snowball Edge Compute Optimized with GPU device. Use AWS Database Migration Service (AWS DMS) with AWS Schema Conversion Tool (AWS SCT) to migrate the database with ongoing changes. Send the Snowball device to AWS to finish the migration and continue the ongoing replication.
D. Order a 1 Gbps dedicated AWS Direct Connect connection to establish a connection with the data center. Use AWS Database Migration Service (AWS DMS) with AWS Schema Conversion Tool (AWS SCT) to migrate the database with replication of ongoing changes.
A company wants to host a scalable web application on AWS. The application will be accessed by users from different geographic regions of the world. Application users will be able to download and upload unique data up to gigabytes in size. The development team wants a cost-effective solution to minimize upload and download latency and maximize performance. What should a solutions architect do to accomplish this?
A. Use Amazon S3 with Transfer Acceleration to host the application.
B. Use Amazon S3 with CacheControl headers to host the application.
C. Use Amazon EC2 with Auto Scaling and Amazon CloudFront to host the application.
D. Use Amazon EC2 with Auto Scaling and Amazon ElastiCache to host the application.
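Option A involves enabling Transfer Acceleration on the bucket and then using the accelerate endpoint from the clients; the bucket and file names below are placeholders:

```python
import boto3
from botocore.config import Config

s3 = boto3.client("s3")

# Enable Transfer Acceleration on the bucket
s3.put_bucket_accelerate_configuration(
    Bucket="example-app-uploads",
    AccelerateConfiguration={"Status": "Enabled"},
)

# Clients upload and download through the accelerate endpoint, which routes
# traffic over AWS edge locations to reduce long-distance latency.
accelerated = boto3.client("s3", config=Config(s3={"use_accelerate_endpoint": True}))
accelerated.upload_file("dataset.bin", "example-app-uploads", "uploads/dataset.bin")
```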
A company used an Amazon RDS for MySQL DB instance during application testing. Before terminating the DB instance at the end of the test cycle, a solutions architect created two backups. The solutions architect created the first backup by using the mysqldump utility to create a database dump. The solutions architect created the second backup by enabling the final DB snapshot option on RDS termination. The company is now planning for a new test cycle and wants to create a new DB instance from the most recent backup. The company has chosen a MySQL-compatible edition of Amazon Aurora to host the DB instance. Which solutions will create the new DB instance? (Select TWO.)
A. Import the RDS snapshot directly into Aurora.
B. Upload the RDS snapshot to Amazon S3. Then import the RDS snapshot into Aurora.
C. Upload the database dump to Amazon S3. Then import the database dump into Aurora.
D. Use AWS Database Migration Service (AWS DMS) to import the RDS snapshot into Aurora.
E. Upload the database dump to Amazon S3. Then use AWS Database Migration Service (AWS DMS) to import the database dump into Aurora.