We regularly update our Microsoft DP-200 Exam Questions; the following is a glimpse of the latest DP-200 Exam Questions included in our Microsoft DP-200 Exam preparation products. Purchase the Microsoft DP-200 Exam preparation material listed above to obtain the full set of updated exam preparation material.
You have an Azure Data Factory that contains 10 pipelines.
You need to label each pipeline with its main purpose of either ingest, transform, or load. The labels must be available for grouping and filtering when using the monitoring experience in Data Factory.
What should you add to each pipeline?
You have an alert on a SQL pool in Azure Synapse that uses the signal logic shown in the exhibit.
On the same day, failures occur at the following times:
08:01
08:03
08:04
08:06
08:11
08:16
08:19
The evaluation period starts on the hour.
At which times will alert notifications be sent?
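The exhibit with the actual signal logic is not reproduced here, so the real threshold, lookback window, and evaluation frequency are unknown. Purely for illustration, the sketch below shows how a count-based alert rule could be evaluated against the failure times listed above, assuming hypothetical parameters: a count greater than 2, a 10-minute lookback, and evaluation every 5 minutes starting on the hour.

```python
from datetime import datetime, timedelta

def alert_times(failures, start, end, frequency, lookback, threshold):
    """Return evaluation times at which the number of failures in the
    lookback window exceeds the threshold (hypothetical rule)."""
    fired = []
    t = start
    while t <= end:
        window_start = t - lookback
        count = sum(1 for f in failures if window_start < f <= t)
        if count > threshold:
            fired.append(t)
        t += frequency
    return fired

day = datetime(2023, 1, 1)  # arbitrary date; times taken from the question
failures = [day.replace(hour=8, minute=m) for m in (1, 3, 4, 6, 11, 16, 19)]

# Assumed signal logic: evaluate every 5 minutes starting on the hour,
# count failures over the preceding 10 minutes, alert when count > 2.
hits = alert_times(
    failures,
    start=day.replace(hour=8, minute=0),
    end=day.replace(hour=8, minute=30),
    frequency=timedelta(minutes=5),
    lookback=timedelta(minutes=10),
    threshold=2,
)
print([t.strftime("%H:%M") for t in hits])
# → ['08:05', '08:10', '08:20'] under these assumed parameters
```

Under different parameters in the actual exhibit, the firing times would differ; the point is that each evaluation looks back over a fixed window and fires only when the aggregated count satisfies the condition.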
You have a SQL pool in Azure Synapse.
You discover that some queries fail or take a long time to complete.
You need to monitor for transactions that have rolled back.
Which dynamic management view should you query?
You plan to build a structured streaming solution in Azure Databricks. The solution will count new events in five-minute intervals and report only events that arrive during the interval. The output will be sent to a Delta Lake table.
Which output mode should you use?
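In Structured Streaming's append output mode, a window's result is emitted exactly once, after the window closes (the watermark passes its end), which matches "report only events that arrive during the interval." A full Spark job is out of scope for a self-contained snippet, but the five-minute tumbling-window counting the question describes can be sketched in plain Python; the function names here are illustrative, not a Spark API.

```python
from collections import Counter
from datetime import datetime

def window_start(ts, minutes=5):
    # Align a timestamp to the start of its tumbling 5-minute window.
    return ts.replace(minute=ts.minute - ts.minute % minutes,
                      second=0, microsecond=0)

def count_per_window(events, minutes=5):
    """Count events per tumbling window; an illustrative stand-in for
    a Structured Streaming groupBy(window(...)).count()."""
    counts = Counter(window_start(e, minutes) for e in events)
    return dict(sorted(counts.items()))

events = [datetime(2023, 1, 1, 8, m) for m in (1, 3, 4, 6, 11)]
counts = count_per_window(events)
print(counts)
# 08:00 window -> 3 events, 08:05 -> 1, 08:10 -> 1
```

Each window appears in the output once with its final count, which is the behavior append mode gives when writing windowed aggregates to a Delta Lake table.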
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure subscription that contains an Azure Storage account.
You plan to implement changes to a data storage solution to meet regulatory and compliance standards.
Every day, Azure needs to identify and delete blobs that were NOT modified during the last 100 days.
Solution: You schedule an Azure Data Factory pipeline with a delete activity.
Does this meet the goal?
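Whatever mechanism performs the deletion (a Data Factory delete activity, or Azure Storage lifecycle management rules), the selection criterion is the same: the blob's last-modified time must be at least 100 days in the past. A minimal sketch of that filter, assuming blobs arrive as hypothetical (name, last_modified) pairs rather than real SDK objects:

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=100)

def stale_blobs(blobs, now):
    """Return names of blobs NOT modified during the last 100 days.

    `blobs` is assumed to be an iterable of (name, last_modified)
    pairs; in a real solution these would come from a storage
    listing call.
    """
    cutoff = now - RETENTION
    return [name for name, modified in blobs if modified <= cutoff]

now = datetime(2023, 6, 1)
blobs = [
    ("old.csv", datetime(2023, 1, 1)),    # ~150 days old -> stale
    ("fresh.csv", datetime(2023, 5, 20)), # recently modified -> keep
]
print(stale_blobs(blobs, now))  # → ['old.csv']
```

The filter runs against a fixed `now` so the result is deterministic; a daily scheduled run would pass the current time instead.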