

DP-600 Implementing Analytics Solutions Using Microsoft Fabric Questions and Answers

Question # 4

You need to create a DAX measure to calculate the average overall satisfaction score.

How should you complete the DAX code? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Question # 5

You need to design a semantic model for the customer satisfaction report.

Which data source authentication method and mode should you use? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Question # 6

What should you recommend using to ingest the customer data into the data store in the AnalyticsPOC workspace?

A. a stored procedure
B. a pipeline that contains a KQL activity
C. a Spark notebook
D. a dataflow

Question # 7


You need to implement the date dimension in the data store. The solution must meet the technical requirements.

What are two ways to achieve the goal? Each correct answer presents a complete solution.

NOTE: Each correct selection is worth one point.

A. Populate the date dimension table by using a dataflow.
B. Populate the date dimension table by using a Stored procedure activity in a pipeline.
C. Populate the date dimension view by using T-SQL.
D. Populate the date dimension table by using a Copy activity in a pipeline.
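
For orientation, the sketch below shows one generic way a date dimension table can be populated from a Fabric notebook. It is illustrative only and is not one of the listed options; the date range, column set, and table name are assumptions.

# Illustrative PySpark sketch: generate a simple date dimension and save it as a Delta table.
# The date range, columns, and table name (dim_date) are assumptions, not part of the question.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

dates = (
    spark.sql(
        "SELECT explode(sequence(to_date('2020-01-01'), to_date('2030-12-31'), interval 1 day)) AS DateKey"
    )
    .withColumn("Year", F.year("DateKey"))
    .withColumn("Month", F.month("DateKey"))
    .withColumn("DayOfWeek", F.dayofweek("DateKey"))
)

dates.write.format("delta").mode("overwrite").saveAsTable("dim_date")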

Question # 8

You need to ensure the data loading activities in the AnalyticsPOC workspace are executed in the appropriate sequence. The solution must meet the technical requirements.

What should you do?

A. Create a pipeline that has dependencies between activities and schedule the pipeline.
B. Create and schedule a Spark job definition.
C. Create a dataflow that has multiple steps and schedule the dataflow.
D. Create and schedule a Spark notebook.

Question # 9

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have a Fabric tenant that contains a semantic model named Model1.

You discover that the following query performs slowly against Model1.

You need to reduce the execution time of the query.

Solution: You replace line 4 by using the following code:

Does this meet the goal?

A. Yes
B. No

Question # 10

You have a Fabric tenant that contains a lakehouse named Lakehouse1. Lakehouse1 contains a subfolder named Subfolder1 that contains CSV files. You need to convert the CSV files into the delta format that has V-Order optimization enabled.

What should you do from Lakehouse explorer?

A. Use the Load to Tables feature.
B. Create a new shortcut in the Files section.
C. Create a new shortcut in the Tables section.
D. Use the Optimize feature.
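
For context on what such a conversion involves, the sketch below shows how the same CSV-to-Delta conversion could be performed from a notebook instead of the Lakehouse explorer. The relative path, table name, and the V-Order session setting are assumptions and not part of the question.

# Minimal PySpark sketch (assumed paths, table name, and config key): read the CSV files
# from the Files section and write them as a Delta table with V-Order enabled.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# V-Order writing is controlled by a Spark session setting in Fabric; the exact key below
# is an assumption based on current runtimes.
spark.conf.set("spark.sql.parquet.vorder.enabled", "true")

df = spark.read.option("header", "true").csv("Files/Subfolder1")   # assumed relative lakehouse path

df.write.format("delta").mode("overwrite").saveAsTable("Subfolder1Data")   # assumed table name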

Question # 11

You need to assign permissions for the data store in the AnalyticsPOC workspace. The solution must meet the security requirements.

Which additional permissions should you assign when you share the data store? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Question # 12

You need to ensure that Contoso can use version control to meet the data analytics requirements and the general requirements. What should you do?

A. Store all the semantic models and reports in Data Lake Gen2 storage.
B. Modify the settings of the Research workspaces to use a GitHub repository.
C. Store all the semantic models and reports in Microsoft OneDrive.
D. Modify the settings of the Research division workspaces to use an Azure Repos repository.

Question # 13

You need to migrate the Research division data for Productline2. The solution must meet the data preparation requirements. How should you complete the code? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Question # 14

You need to recommend a solution to group the Research division workspaces.

What should you include in the recommendation? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Question # 15

Which syntax should you use in a notebook to access the Research division data for Productline1?

(Options A through D are code samples shown as exhibits.)

A. Option A
B. Option B
C. Option C
D. Option D
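
The four code options are not reproduced here. As a generic point of reference only, reading lakehouse data from a Fabric notebook typically looks like the sketch below; the workspace, lakehouse, and table names are placeholders, not the exam's options.

# Hypothetical example of reading a lakehouse Delta table from a Fabric notebook.
# 'spark' and 'display' are the built-in notebook session objects; every name in the
# OneLake path below is a placeholder.
df = spark.read.format("delta").load(
    "abfss://ResearchWorkspace@onelake.dfs.fabric.microsoft.com/ResearchLakehouse.Lakehouse/Tables/productline1"
)
display(df)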

Question # 16

Which workspace role assignments should you recommend for ResearchReviewersGroup1 and ResearchReviewersGroup2? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Question # 17

You need to recommend which type of Fabric capacity SKU meets the data analytics requirements for the Research division. What should you recommend?

A. EM
B. F
C. P
D. A

Question # 18

You have a data warehouse that contains a table named Stage.Customers. Stage.Customers contains all the customer record updates from a customer relationship management (CRM) system. There can be multiple updates per customer.

You need to write a T-SQL query that will return the customer ID, name, postal code, and the last updated time of the most recent row for each customer ID.

How should you complete the code? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.
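
The hotspot exhibit is not reproduced here. As a neutral illustration of the underlying pattern (latest row per customer via a window function), the PySpark sketch below ranks rows per customer by update time and keeps the most recent one; the table and column names are assumptions.

# PySpark sketch of the "most recent row per customer" pattern; the question itself asks
# for T-SQL. Table and column names (stage_customers, CustomerID, LastUpdated, ...) are assumed.
from pyspark.sql import SparkSession, Window, functions as F

spark = SparkSession.builder.getOrCreate()

w = Window.partitionBy("CustomerID").orderBy(F.col("LastUpdated").desc())

latest = (
    spark.table("stage_customers")
    .withColumn("rn", F.row_number().over(w))
    .filter("rn = 1")
    .select("CustomerID", "CustomerName", "PostalCode", "LastUpdated")
)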

Question # 19

You need to refresh the Orders table of the Online Sales department. The solution must meet the semantic model requirements. What should you include in the solution?

A. an Azure Data Factory pipeline that executes a dataflow to retrieve the minimum value of the OrderID column in the destination lakehouse
B. an Azure Data Factory pipeline that executes a Stored procedure activity to retrieve the maximum value of the OrderID column in the destination lakehouse
C. an Azure Data Factory pipeline that executes a dataflow to retrieve the maximum value of the OrderID column in the destination lakehouse
D. an Azure Data Factory pipeline that executes a Stored procedure activity to retrieve the minimum value of the OrderID column in the destination lakehouse
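
The options differ on dataflow versus Stored procedure activity and minimum versus maximum value. As a neutral illustration of the incremental-load pattern itself, the sketch below uses the highest already-loaded OrderID as a watermark and appends only newer rows; all object names are assumptions, and this is not a claim about which option is correct.

# Rough PySpark sketch of a watermark-based incremental load. Table names (orders,
# staging_orders) are assumptions; the watermark column matches the question's OrderID.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

max_id = (
    spark.table("orders")                      # destination table (assumed name)
    .agg(F.max("OrderID").alias("max_id"))
    .collect()[0]["max_id"]
) or 0

new_rows = spark.table("staging_orders").filter(F.col("OrderID") > max_id)   # assumed source

new_rows.write.format("delta").mode("append").saveAsTable("orders")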

Question # 20

You need to resolve the issue with the pricing group classification.

How should you complete the T-SQL statement? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.
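
The T-SQL exhibit is not reproduced here. As a generic illustration of what a tiered pricing-group classification looks like, the PySpark sketch below assigns a group per row based on list-price boundaries; the group names, boundaries, and table name are assumptions.

# Illustrative tiered classification in PySpark; boundaries, labels, and the source table
# name are placeholders, not the exam's T-SQL.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
products = spark.table("products")             # assumed source table

classified = products.withColumn(
    "PricingGroup",
    F.when(F.col("ListPrice") < 50, "Low")
     .when(F.col("ListPrice") < 1000, "Medium")
     .otherwise("High"),
)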

Question # 21

Which type of data store should you recommend in the AnalyticsPOC workspace?

A. a data lake
B. a warehouse
C. a lakehouse
D. an external Hive metastore
