Free AWS-Certified-DevOps-Engineer-Professional Exam Braindumps

Pass your Amazon AWS Certified DevOps Engineer Professional exam with these free Questions and Answers

QUESTION 36

The management team at a company with a large on-premises OpenStack environment wants to move non-production workloads to AWS. An AWS Direct Connect connection has been provisioned and configured to connect the environments. Due to contractual obligations, the production workloads must remain on-premises, and will be moved to AWS after the next contract negotiation. The company follows Center for Internet Security (CIS) standards for hardening images; this configuration was developed using the company's configuration management system.
Which solution will automatically create an identical image in the AWS environment without significant overhead?

  A. Write an AWS CloudFormation template that will create an Amazon EC2 instance. Use cloud-init to install the configuration management agent, use cfn-signal to wait for configuration management to successfully apply, and use an AWS Lambda-backed custom resource to create the AMI.
  B. Log in to the console, launch an Amazon EC2 instance, and install the configuration management agent. When changes are applied through the configuration management system, log in to the console and create a new AMI from the instance.
  C. Create a new AWS OpsWorks layer and mirror the image hardening standards. Use this layer as the baseline for all AWS workloads.
  D. When a change is made in the configuration management system, a job in Jenkins is triggered to use the VM Import command to create an Amazon EC2 instance in the Amazon VPC. Use lifecycle hooks to launch an AWS Lambda function to create the AMI.

Correct Answer: D
https://www.brad-x.com/2015/10/01/importing-an-openstack-vm-into-amazon-ec2/
https://aws.amazon.com/ec2/vm-import/
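The VM Import step in the correct answer can be sketched as follows. This is a minimal illustration, not the company's actual pipeline: the bucket and key names are placeholders for wherever Jenkins uploads the image exported from OpenStack, and the real `ec2:ImportImage` call (commented out) additionally requires the `vmimport` service role to exist in the account.

```python
# import boto3  # needed only for the real API call at the bottom

def build_import_image_request(s3_bucket, s3_key, description):
    """Build the parameter dict for EC2 VM Import (ec2:ImportImage).

    The bucket/key identify the disk image that the configuration
    management system exported from OpenStack (placeholder names).
    """
    return {
        "Description": description,
        "DiskContainers": [
            {
                "Description": description,
                "Format": "raw",  # qcow2 is not supported; convert first
                "UserBucket": {"S3Bucket": s3_bucket, "S3Key": s3_key},
            }
        ],
    }

if __name__ == "__main__":
    params = build_import_image_request(
        "cis-hardened-exports",          # placeholder bucket
        "images/base-cis.raw",           # placeholder key
        "CIS-hardened base image",
    )
    # Real call, once the vmimport role is configured:
    # task = boto3.client("ec2").import_image(**params)
```

A Jenkins job would run this after each configuration-management change; the resulting import task produces the AMI (or an instance a lifecycle-hooked Lambda snapshots into one).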

QUESTION 37

A DevOps Engineer must automate a weekly process of identifying unnecessary permissions on a per-user basis, across all users in an AWS account. This process should evaluate the permissions currently granted to each user by examining the user's attached IAM access policies compared to the permissions the user has actually used in the past 90 days. Any differences in the comparison would indicate that the user has more permissions than are required. A report of the deltas should be sent to the Information Security team for further review and IAM user access policy revisions, as required.
Which solution is fully automated and will produce the MOST detailed deltas report?

  A. Create an AWS Lambda function that calls the IAM Access Advisor API to pull service permissions granted on a user-by-user basis for all users in the AWS account. Ensure that Access Advisor is configured with a tracking period of 90 days. Invoke the Lambda function using an Amazon CloudWatch Events rule on a weekly schedule. For each record, by user, by service, if the Access Advisor Last Accessed field indicates a day count instead of "Not accessed in the tracking period," this indicates a delta compared to what is in the user's currently attached access policies. After Lambda has iterated through all users in the AWS account, configure it to generate a report and send the report using Amazon SES.
  B. Configure an AWS CloudTrail trail that spans all AWS Regions and all read/write events, and point this trail to an Amazon S3 bucket. Create an Amazon Athena table and specify the S3 bucket ARN in the CREATE TABLE query. Create an AWS Lambda function that accesses the Athena table using the SDK, which performs a SELECT, ensuring that the WHERE clause includes userIdentity, eventName, and eventTime. Compare the results against the user's currently attached IAM access policies to determine any deltas. Configure an Amazon CloudWatch Events schedule to automate this process to run once a week. Configure Amazon SES to send a consolidated report to the Information Security team.
  C. Configure VPC Flow Logs on all subnets across all VPCs in all Regions to capture user traffic across the entire account. Ensure that all logs are being sent to a centralized Amazon S3 bucket so all flow logs can be consolidated and aggregated. Create an AWS Lambda function that is triggered once a week by an Amazon CloudWatch Events schedule. Ensure that the Lambda function parses the flow log files for the following information: IAM user ID, subnet ID, VPC ID, Allow/Reject status per API call, and service name. Then have the function determine the deltas on a user-by-user basis. Configure the Lambda function to send the consolidated report using Amazon SES.
  D. Create an Amazon ES cluster and note its endpoint URL, which will be provided as an environment variable into a Lambda function. Configure an Amazon S3 event on the AWS CloudTrail trail destination S3 bucket and ensure that the event is configured to send to a Lambda function. Create the Lambda function to consume the events, parse the input from JSON, and transform it to an Amazon ES document format. POST the documents to the Amazon ES cluster's endpoint by way of the passed-in environment variable. Make sure that the proper indexing exists in Amazon ES and use Apache Lucene queries to parse the permissions on a user-by-user basis. Export the deltas into a report and have Amazon ES send the reports to the Information Security team using Amazon SES every week.

Correct Answer: B
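The delta comparison at the heart of the CloudTrail-plus-Athena approach reduces to set subtraction: actions granted by the user's attached policies minus actions actually observed in CloudTrail events. A minimal sketch, using hypothetical example data (the action sets would really come from parsed policy documents and Athena query results):

```python
def permission_deltas(allowed_actions, used_actions):
    """Return actions granted to a user but never used in the
    tracking period. Inputs are sets of IAM action strings."""
    return sorted(set(allowed_actions) - set(used_actions))

# Hypothetical data for one user:
allowed = {"s3:GetObject", "s3:PutObject", "iam:PassRole"}
used = {"s3:GetObject"}
print(permission_deltas(allowed, used))
# -> ['iam:PassRole', 's3:PutObject']  (unused grants to report)
```

The weekly Lambda would run this per user and feed the combined results into the SES report for the Information Security team.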

QUESTION 38

An ecommerce company is running an application on AWS. The company wants to create a standby disaster recovery solution in an additional Region that keeps the current application code. The application runs on Amazon EC2 instances behind an Application Load Balancer (ALB). The instances run in an EC2 Auto Scaling group across multiple Availability Zones. The database layer is hosted on an Amazon RDS MySQL Multi-AZ DB instance. Amazon Route 53 DNS records point to the ALB.
Which combination of actions will meet these requirements with the LOWEST cost? (Select THREE.)

  1. A. Configure a failover routing policy for the application DNS entry.
  2. B. Configure a geolocation routing policy for the application DNS entry.
  3. C. Create a cross-Region RDS read replica in the new standby Region.
  4. D. Migrate the database layer to Amazon DynamoDB and enable global replication to the new standby Region.
  5. E. Provision the ALB and Auto Scaling group in the new standby Region and set the desired capacity to match the active Region.
  6. F. Provision the ALB and Auto Scaling group in the new standby Region and set the desired capacity to 1.

Correct Answer: AEF
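The failover routing policy from answer A amounts to two alias records in the hosted zone, one PRIMARY pointing at the active-Region ALB and one SECONDARY at the standby ALB. A sketch of building such a record for `route53:ChangeResourceRecordSets` (the DNS names, zone IDs, and identifiers are placeholders):

```python
def failover_alias_record(name, alb_dns, alb_zone_id, role, set_id):
    """Build one resource record set for a Route 53 failover pair.

    role is 'PRIMARY' (active Region) or 'SECONDARY' (standby).
    alb_dns / alb_zone_id are the target ALB's DNS name and its
    Route 53 hosted zone ID (placeholders in the example below).
    """
    return {
        "Name": name,
        "Type": "A",
        "SetIdentifier": set_id,
        "Failover": role,
        "AliasTarget": {
            "HostedZoneId": alb_zone_id,
            "DNSName": alb_dns,
            "EvaluateTargetHealth": True,  # fail over on ALB health
        },
    }

primary = failover_alias_record(
    "app.example.com.", "alb-active.elb.amazonaws.com", "ZALBZONEID1",
    "PRIMARY", "active-region",
)
```

A matching SECONDARY record for the standby Region's ALB completes the pair; keeping the standby Auto Scaling group at a desired capacity of 1 (answer F) is what makes this the lowest-cost warm standby.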

QUESTION 39

A company is using an AWS CodeBuild project to build and package an application. The packages are copied to a shared Amazon S3 bucket before being deployed across multiple AWS accounts.
The buildspec.yml file contains the following:
(Exhibit: buildspec.yml not shown.)
The DevOps Engineer has noticed that anybody with an AWS account is able to download the artifacts. What steps should the DevOps Engineer take to stop this?

  A. Modify the post_build command to use --acl public-read and configure a bucket policy that grants read access to the relevant AWS accounts only.
  B. Configure a default ACL for the S3 bucket that defines the set of authenticated users as the relevant AWS accounts only and grants read-only access.
  C. Create an S3 bucket policy that grants read access to the relevant AWS accounts and denies read access to the principal "*".
  D. Modify the post_build command to remove --acl authenticated-read and configure a bucket policy that allows read access to the relevant AWS accounts only.

Correct Answer: D
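The bucket policy from the correct answer could look like the sketch below, built here as a Python dict for readability. The bucket name and account ID are placeholders; the real policy would list each account that deploys the artifacts.

```python
import json

def artifact_read_policy(bucket, account_ids):
    """S3 bucket policy granting s3:GetObject only to the named
    AWS accounts (placeholder IDs in the example below)."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowArtifactReadForDeployAccounts",
                "Effect": "Allow",
                "Principal": {
                    "AWS": [f"arn:aws:iam::{a}:root" for a in account_ids]
                },
                "Action": "s3:GetObject",
                "Resource": f"arn:aws:s3:::{bucket}/*",
            }
        ],
    }

print(json.dumps(artifact_read_policy("shared-artifacts", ["111122223333"]), indent=2))
```

With the authenticated-read ACL removed from the post_build step, objects fall back to bucket-owner-only access, and this policy then opens read access solely to the listed accounts.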

QUESTION 40

A company has deployed several applications globally. Recently, Security Auditors found that a few Amazon EC2 instances were launched without Amazon EBS disk encryption. The Auditors have requested a report detailing all EBS volumes that were not encrypted, across multiple AWS accounts and Regions. They also want to be notified whenever this occurs in the future.
How can this be automated with the LEAST amount of operational overhead?

  A. Create an AWS Lambda function to set up an AWS Config rule on all the target accounts. Use AWS Config aggregators to collect data from multiple accounts and Regions. Export the aggregated report to an Amazon S3 bucket and use Amazon SNS to deliver the notifications.
  B. Set up AWS CloudTrail to deliver all events to an Amazon S3 bucket in a centralized account. Use the S3 event notification feature to invoke an AWS Lambda function to parse AWS CloudTrail logs whenever logs are delivered to the S3 bucket. Publish the output to an Amazon SNS topic using the same Lambda function.
  C. Create an AWS CloudFormation template that adds an AWS Config managed rule for EBS encryption. Use a CloudFormation stack set to deploy the template across all accounts and Regions. Store consolidated evaluation results from Config rules in Amazon S3. Send a notification using Amazon SNS when non-compliant resources are detected.
  D. Using the AWS CLI, run a script periodically that invokes the aws ec2 describe-volumes query with a JMESPath query filter. Then, write the output to an Amazon S3 bucket. Set up an S3 event notification to send events using Amazon SNS when new data is written to the S3 bucket.

Correct Answer: C
https://aws.amazon.com/blogs/aws/aws-config-update-aggregate-compliance-data-across-accounts-regions/
https://docs.aws.amazon.com/config/latest/developerguide/aws-config-managed-rules-cloudformation-templates
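The CloudFormation fragment behind the correct answer is small: one `AWS::Config::ConfigRule` resource using the `ENCRYPTED_VOLUMES` managed rule, deployed everywhere via StackSets. A sketch, generated here as JSON from Python; the logical ID and rule name are placeholders.

```python
import json

# Minimal CloudFormation template fragment for the AWS Config managed
# rule that flags unencrypted EBS volumes. Deploy via a StackSet to
# cover all accounts and Regions.
template = {
    "Resources": {
        "EbsEncryptionRule": {  # placeholder logical ID
            "Type": "AWS::Config::ConfigRule",
            "Properties": {
                "ConfigRuleName": "ebs-volumes-encrypted",  # placeholder
                "Source": {
                    "Owner": "AWS",
                    "SourceIdentifier": "ENCRYPTED_VOLUMES",
                },
            },
        }
    }
}

print(json.dumps(template, indent=2))
```

AWS Config then evaluates every volume continuously, which is what keeps the operational overhead lower than the CloudTrail-parsing or scripted describe-volumes alternatives.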

