DP-200

DP-200 Test 1

You have an Azure Data Factory pipeline that loads source data into an Azure Data Lake Storage Gen2 account incrementally.

The source table's LastUpdatedDate column identifies the data to be loaded.

You plan to run the pipeline every four hours.

You must ensure that the pipeline execution meets the following requirements:

* Pipeline runs that fail because of concurrency or throttling limits are retried automatically.
* Backfilling of the existing data in the table is supported.

Which type of trigger should you use?

Explanation:
A tumbling window trigger fires on a fixed-size, non-overlapping, contiguous time interval, so each run processes exactly one window of source data. Tumbling window triggers support a retry policy for runs that fail because of concurrency or throttling limits, and the trigger's start time can be set in the past to backfill existing data.
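
For illustration, a minimal sketch of the source query that the tumbling-window-triggered copy could run; the table name and the @WindowStart/@WindowEnd parameters (which would be bound to the trigger's windowStartTime and windowEndTime values) are assumptions, with only LastUpdatedDate taken from the scenario.

```sql
-- Hypothetical incremental-load query for one trigger window: pick up only
-- the rows whose LastUpdatedDate falls inside the window boundaries passed
-- in from the tumbling window trigger.
SELECT *
FROM dbo.SourceTable
WHERE LastUpdatedDate >  @WindowStart
  AND LastUpdatedDate <= @WindowEnd;
```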

You have an Azure Synapse Analytics dedicated SQL pool named Pool1 and a database named DB1. DB1 contains a fact table named Table1.

You need to identify the extent of the data skew in Table1.

In Synapse Studio, what should you do?

The correct answer:
Connect to Pool1 and query sys.dm_pdw_nodes_db_partition_stats
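
As a rough sketch (modeled on the table-size queries in the Synapse documentation), the following aggregates row counts per distribution so that skew becomes visible; treat the exact joins as an assumption to verify against your environment.

```sql
-- Approximate per-distribution row counts for Table1. A large spread between
-- the smallest and largest distribution_row_count indicates data skew.
SELECT ps.distribution_id,
       SUM(ps.row_count) AS distribution_row_count
FROM sys.dm_pdw_nodes_db_partition_stats AS ps
JOIN sys.pdw_nodes_tables AS nt
    ON ps.object_id = nt.object_id
   AND ps.pdw_node_id = nt.pdw_node_id
   AND ps.distribution_id = nt.distribution_id
JOIN sys.pdw_table_mappings AS tm
    ON nt.name = tm.physical_name
JOIN sys.tables AS t
    ON tm.object_id = t.object_id
WHERE t.name = 'Table1'
GROUP BY ps.distribution_id
ORDER BY distribution_row_count DESC;
```

DBCC PDW_SHOWSPACEUSED('dbo.Table1') returns similar per-distribution row and space figures with less typing.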

Azure Stream Analytics is used to receive Twitter data from Azure Event Hubs and send it to an Azure Blob storage account.

Every minute, you must output the number of tweets from the previous five minutes.

Which windowing function should you use?

Explanation:
A hopping window specification has three required parameters: timeunit, windowsize (how long each window lasts), and hopsize (how far each window moves forward relative to the previous one). An optional fourth parameter, offsetsize, can also be used. To emit a count every minute that covers the previous five minutes, use a hopping window with a window size of five minutes and a hop size of one minute.
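
A minimal Stream Analytics query sketch for this scenario; the input name (TweetsInput), output name (BlobOutput), and timestamp column (CreatedAt) are assumptions.

```sql
-- Emit a tweet count every minute that covers the previous five minutes:
-- HoppingWindow(timeunit, windowsize, hopsize) = 5-minute windows that hop
-- forward 1 minute at a time.
SELECT COUNT(*) AS TweetCount,
       System.Timestamp() AS WindowEnd
INTO   BlobOutput
FROM   TweetsInput TIMESTAMP BY CreatedAt
GROUP BY HoppingWindow(minute, 5, 1)
```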

You are developing an application that uses Azure Data Lake Storage Gen2.

You need to recommend a solution that grants a specific application access to the storage account for a fixed period of time.

What should you include in the recommendation?

Explanation:
A shared access signature (SAS) lets you delegate access to resources in your storage account. A SAS gives you fine-grained control over how a client can access your data: which resources it can reach, which permissions it has, and the start and expiry times between which the signature is valid. The built-in expiry time makes a SAS a good fit for granting an application access for a set length of time.
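
As one hedged illustration of how a client consumes a time-limited SAS, a SQL pool can store the token in a database scoped credential and use it for external access to the storage account; the credential name and placeholder secret are assumptions, and a database master key must already exist.

```sql
-- Hypothetical example: the st= (start) and se= (expiry) parameters inside
-- the SAS token limit how long this credential can access the storage account.
CREATE DATABASE SCOPED CREDENTIAL AppSasCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<sas-token-without-the-leading-question-mark>';
```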

You are using Azure Monitor metrics to monitor an Azure Stream Analytics job.

You notice that the average watermark delay over the last 12 hours has consistently exceeded the configured late arrival tolerance.

What could be the source of this behavior?

The correct answer:
The job's resources are insufficient to handle the volume of incoming data.

You have an Azure Data Lake Storage Gen2 account named adls2 that is protected by a virtual network.

You are designing a SQL pool in Azure Synapse that will use adls2 as a source.

What should you use to authenticate to adls2?

Explanation:
Managed identity is an Azure Active Directory feature that attaches an identity to an Azure resource without requiring you to manage that identity's credentials. The identity can be used to authenticate to any service that supports Azure AD authentication, such as Microsoft Graph, Key Vault, Azure Storage, and custom APIs. Because the SQL pool's managed identity can be allowed through the storage account's firewall as a trusted Azure service, it is the appropriate way to reach a virtual-network-protected Data Lake Storage Gen2 account.
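
A sketch of how the SQL pool could use its managed identity when defining adls2 as an external source; the credential, data source, and container names are assumptions, and a database master key must already exist.

```sql
-- Authenticate to the VNet-protected storage account with the managed
-- identity rather than an account key or SAS.
CREATE DATABASE SCOPED CREDENTIAL msi_credential
WITH IDENTITY = 'Managed Service Identity';

CREATE EXTERNAL DATA SOURCE adls2_source
WITH (
    TYPE = HADOOP,
    LOCATION = 'abfss://data@adls2.dfs.core.windows.net',
    CREDENTIAL = msi_credential
);
```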

You have a data warehouse in Azure Synapse Analytics.
You must ensure that the data in the data warehouse is encrypted at rest.
What should you enable?

Explanation:
Transparent Data Encryption (TDE) is a database encryption technology offered by Microsoft, IBM, and Oracle. It provides file-level encryption: data and log files are encrypted on disk and decrypted as they are read into memory. TDE therefore addresses data security at rest, protecting the database files on the server and, as a result, on backup media.
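
For a dedicated SQL pool, TDE can be enabled from the Azure portal or with a single T-SQL statement run while connected to the master database; the data warehouse name below is an assumption.

```sql
-- Enable Transparent Data Encryption for the data warehouse.
ALTER DATABASE [DW1] SET ENCRYPTION ON;
```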

You have an Azure subscription that contains the following resources:
* An Azure Cosmos DB SQL API account
* An Azure key vault named vault1 that contains an encryption key named key1
You need to ensure that all data stored in the Azure Cosmos DB account is encrypted by using key1. What should you do first?

The correct answer:
Create a managed identity

You are monitoring an Azure Stream Analytics job.
The number of Backlogged Input Events has been 20 for the last hour.
You need to reduce the number of Backlogged Input Events.
What should you do?

The correct answer:
Increase the streaming units for the job