Free DAS-C01 Exam Braindumps

Pass your AWS Certified Data Analytics - Specialty exam with these free Questions and Answers

Page 7 of 32
QUESTION 26

A company using Amazon QuickSight Enterprise edition has thousands of dashboards, analyses, and datasets. The company struggles to manage and assign permissions for granting users access to various items within QuickSight. The company wants to make it easier to implement sharing and permissions management.
Which solution should the company implement to simplify permissions management?

  1. A. Use QuickSight folders to organize dashboards, analyses, and datasets. Assign individual users permissions to these folders.
  2. B. Use QuickSight folders to organize dashboards, analyses, and datasets. Assign group permissions by using these folders.
  3. C. Use AWS IAM resource-based policies to assign group permissions to QuickSight items.
  4. D. Use QuickSight user management APIs to provision group permissions based on dashboard naming conventions.

Correct Answer: B
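The folder-based approach described in the first two options can be sketched with the QuickSight API: grant a group principal permissions on a shared folder, and the assets inside it inherit that access. A minimal sketch follows; the account ID, region, namespace, group, and folder names are placeholders, not values from the question.

```python
# Hypothetical sketch: build the GrantPermissions entry for
# quicksight.update_folder_permissions, using a GROUP principal so one
# grant covers every user in the group. All identifiers are made up.

def folder_grant(account_id: str, region: str, namespace: str, group: str) -> dict:
    """Return a permissions entry granting a QuickSight group read access."""
    principal = (f"arn:aws:quicksight:{region}:{account_id}:"
                 f"group/{namespace}/{group}")
    return {
        "Principal": principal,
        "Actions": ["quicksight:DescribeFolder"],  # read access to the folder
    }

grant = folder_grant("111122223333", "us-east-1", "default", "sales-analysts")

# The actual call would look like this (requires boto3 and AWS credentials):
# import boto3
# qs = boto3.client("quicksight", region_name="us-east-1")
# qs.update_folder_permissions(
#     AwsAccountId="111122223333",
#     FolderId="shared-sales-folder",
#     GrantPermissions=[grant],
# )
```

One grant per group replaces thousands of per-asset, per-user grants, which is the point of the folder-based answer.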

QUESTION 27

A media company is using Amazon QuickSight dashboards to visualize its national sales data. The dashboard is using a dataset with these fields: ID, date, time_zone, city, state, country, longitude, latitude, sales_volume, and number_of_items.
To modify ongoing campaigns, the company wants an interactive and intuitive visualization of which states across the country recorded a significantly lower sales volume compared to the national average.
Which addition to the company’s QuickSight dashboard will meet this requirement?

  1. A. A geospatial color-coded chart of sales volume data across the country.
  2. B. A pivot table of sales volume data summed up at the state level.
  3. C. A drill-down layer for state-level sales volume data.
  4. D. A drill through to other dashboards containing state-level sales volume data.

Correct Answer: A

QUESTION 28

A company hosts an on-premises PostgreSQL database that contains historical data. An internal legacy application uses the database for read-only activities. The company’s business team wants to move the data to a data lake in Amazon S3 as soon as possible and enrich the data for analytics.
The company has set up an AWS Direct Connect connection between its VPC and its on-premises network. A data analytics specialist must design a solution that achieves the business team’s goals with the least operational overhead.
Which solution meets these requirements?

  1. A. Upload the data from the on-premises PostgreSQL database to Amazon S3 by using a customized batch upload process. Use the AWS Glue crawler to catalog the data in Amazon S3. Use an AWS Glue job to enrich and store the result in a separate S3 bucket in Apache Parquet format. Use Amazon Athena to query the data.
  2. B. Create an Amazon RDS for PostgreSQL database and use AWS Database Migration Service (AWS DMS) to migrate the data into Amazon RDS. Use AWS Data Pipeline to copy and enrich the data from the Amazon RDS for PostgreSQL table and move the data to Amazon S3. Use Amazon Athena to query the data.
  3. C. Configure an AWS Glue crawler to use a JDBC connection to catalog the data in the on-premises database. Use an AWS Glue job to enrich the data and save the result to Amazon S3 in Apache Parquet format. Create an Amazon Redshift cluster and use Amazon Redshift Spectrum to query the data.
  4. D. Configure an AWS Glue crawler to use a JDBC connection to catalog the data in the on-premises database. Use an AWS Glue job to enrich the data and save the result to Amazon S3 in Apache Parquet format. Use Amazon Athena to query the data.

Correct Answer: D
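The Glue-crawler-over-JDBC step that the low-overhead options describe can be sketched as the parameters passed to Glue's `create_crawler` API. The crawler, role, connection, catalog database, and include path below are illustrative placeholders, not resources from the question.

```python
# Hypothetical sketch: parameters for glue.create_crawler with a JDBC
# target, so the crawler catalogs the on-premises PostgreSQL tables over
# the existing Direct Connect link. All names are placeholders.

def jdbc_crawler_params(name: str, role_arn: str, connection: str,
                        glue_db: str, include_path: str) -> dict:
    """Build the keyword arguments for boto3's glue.create_crawler."""
    return {
        "Name": name,
        "Role": role_arn,                        # IAM role the crawler assumes
        "DatabaseName": glue_db,                 # target Glue Data Catalog database
        "Targets": {
            "JdbcTargets": [
                {"ConnectionName": connection,   # Glue JDBC connection to on-prem DB
                 "Path": include_path},          # e.g. "<database>/<schema>/%"
            ]
        },
    }

params = jdbc_crawler_params(
    "onprem-postgres-crawler",
    "arn:aws:iam::111122223333:role/GlueCrawlerRole",
    "onprem-postgres-conn",
    "legacy_catalog",
    "historical/public/%",
)
# boto3.client("glue").create_crawler(**params) would register the crawler;
# a Glue job can then read the cataloged tables, enrich them, and write
# Parquet to S3 for Athena to query.
```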

QUESTION 29

A company has collected more than 100 TB of log files in the last 24 months. The files are stored as raw text in a dedicated Amazon S3 bucket. Each object has a key of the form year-month-day_log_HHmmss.txt where HHmmss represents the time the log file was initially created. A table was created in Amazon Athena that points to the S3 bucket. One-time queries are run against a subset of columns in the table several times an hour.
A data analyst must make changes to reduce the cost of running these queries. Management wants a solution with minimal maintenance overhead.
Which combination of steps should the data analyst take to meet these requirements? (Choose three.)

  1. A. Convert the log files to Apache Avro format.
  2. B. Add a key prefix of the form date=year-month-day/ to the S3 objects to partition the data.
  3. C. Convert the log files to Apache Parquet format.
  4. D. Add a key prefix of the form year-month-day/ to the S3 objects to partition the data.
  5. E. Drop and recreate the table with the PARTITIONED BY clause. Run the ALTER TABLE ADD PARTITION statement.
  6. F. Drop and recreate the table with the PARTITIONED BY clause. Run the MSCK REPAIR TABLE statement.

Correct Answer: BCF
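The winning combination, Hive-style `date=` prefixes, Parquet, and a partitioned table loaded with MSCK REPAIR TABLE, can be sketched as the Athena DDL it implies. A minimal sketch generating the statements; the table name, bucket, and column names are placeholders, not values from the question.

```python
# Hypothetical sketch: the Athena DDL implied by the correct answers.
# A Parquet table partitioned on a Hive-style date= key; MSCK REPAIR TABLE
# then auto-discovers every s3://<bucket>/date=YYYY-MM-DD/ prefix, so no
# per-partition ALTER TABLE maintenance is needed. Names are placeholders.

def partitioned_table_ddl(table: str, bucket: str) -> str:
    """Return CREATE EXTERNAL TABLE DDL for a date-partitioned Parquet table."""
    return f"""
CREATE EXTERNAL TABLE {table} (
  request_time string,
  message      string
)
PARTITIONED BY (`date` string)  -- matches the date=year-month-day/ key prefix
STORED AS PARQUET               -- columnar format: scan only queried columns
LOCATION 's3://{bucket}/';
""".strip()

ddl = partitioned_table_ddl("logs_parquet", "log-archive-bucket")
repair = "MSCK REPAIR TABLE logs_parquet;"  # registers all date= partitions
```

Partition pruning plus Parquet's columnar layout means each query scans only the requested dates and columns, which is where the cost reduction comes from.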

QUESTION 30

An IoT company wants to release a new device that will collect data to track sleep overnight on an intelligent mattress. Sensors will send data that will be uploaded to an Amazon S3 bucket. About 2 MB of data is generated each night for each bed. Data must be processed and summarized for each user, and the results need to be available as soon as possible. Part of the process consists of time windowing and other functions. Based on tests with a Python script, every run will require about 1 GB of memory and will complete within a couple of minutes.
Which solution will run the script in the MOST cost-effective way?

  1. A. AWS Lambda with a Python script
  2. B. AWS Glue with a Scala job
  3. C. Amazon EMR with an Apache Spark script
  4. D. AWS Glue with a PySpark job

Correct Answer: A
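The Lambda approach in answer A fits because each run is small (about 2 MB in, ~1 GB memory, a couple of minutes), well inside Lambda's limits, with no cluster to keep warm. A minimal sketch of the time-windowing step follows; the event shape, field names, and window size are invented for illustration, and a real handler would first fetch the uploaded object from S3.

```python
# Hypothetical sketch of the per-bed summarization Lambda (answer A):
# bucket sensor readings into fixed time windows and average them.
# Event fields and the 5-minute window are assumptions, not from the source.
from collections import defaultdict

WINDOW_SECONDS = 300  # 5-minute tumbling windows

def summarize(readings):
    """readings: list of (epoch_seconds, heart_rate) tuples -> window averages."""
    windows = defaultdict(list)
    for ts, hr in readings:
        windows[ts - ts % WINDOW_SECONDS].append(hr)   # floor to window start
    return {start: sum(v) / len(v) for start, v in sorted(windows.items())}

def lambda_handler(event, context):
    # In practice this would be triggered by the S3 upload and read the
    # object via boto3; here the readings arrive in the event for brevity.
    return {"bed_id": event["bed_id"], "windows": summarize(event["readings"])}
```

Because billing is per invocation and per millisecond of runtime, short bursty jobs like this cost far less on Lambda than on an always-provisioned EMR cluster or a Glue job with its per-minute minimums.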

