Exam Details

  • Exam Code: DP-203
  • Exam Name: Data Engineering on Microsoft Azure
  • Certification: Microsoft Certifications
  • Vendor: Microsoft
  • Total Questions: 398 Q&As
  • Last Updated: Mar 22, 2025

Microsoft Certifications DP-203 Questions & Answers

  • Question 51:

    You have an Azure SQL database named DB1 and an Azure Data Factory data pipeline named pipeline.

    From Data Factory, you configure a linked service to DB1.

    In DB1, you create a stored procedure named SP1. SP1 returns a single row of data that has four columns.

    You need to add an activity to pipeline to execute SP1. The solution must ensure that the values in the columns are stored as pipeline variables.

    Which two types of activities can you use to execute SP1? (Refer to the Data Engineering on Microsoft Azure documentation, available at Microsoft.com, for answers.)

    A. Script

    B. Copy

    C. Lookup

    D. Stored Procedure

  • Question 52:

    You have an Azure Synapse Analytics dedicated SQL pool named Pool1 that contains a table named Sales. Sales has row-level security (RLS) applied. RLS uses the following predicate filter.

    A user named SalesUser1 is assigned the db_datareader role for Pool1. Which rows in the Sales table are returned when SalesUser1 queries the table?

    A. only the rows for which the value in the User_Name column is SalesUser1

    B. all the rows

    C. only the rows for which the value in the SalesRep column is Manager

    D. only the rows for which the value in the SalesRep column is SalesUser1

  • Question 53:

    You have an Azure Synapse Analytics dedicated SQL pool that contains a table named Table1. Table1 contains the following:

    1. One billion rows

    2. A clustered columnstore index

    3. A hash-distributed column named Product Key

    4. A column named Sales Date that is of the date data type and cannot be null

    Thirty million rows will be added to Table1 each month.

    You need to partition Table1 based on the Sales Date column. The solution must optimize query performance and data loading. How often should you create a partition?

    A. once per month

    B. once per year

    C. once per day

    D. once per week
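
    The trade-off behind this question can be checked with a little arithmetic. A minimal sketch, assuming the fixed 60 distributions of a dedicated SQL pool and the common guidance of roughly one million rows per rowgroup for effective columnstore compression (the function name and constants are illustrative, not part of the question):

    ```python
    # Rough sizing check for columnstore partitions in a dedicated SQL pool.
    # Assumptions: 60 distributions (fixed in dedicated SQL pools) and the
    # ~1,000,000-rows-per-rowgroup guidance for columnstore compression.
    DISTRIBUTIONS = 60
    TARGET_ROWS_PER_ROWGROUP = 1_000_000
    ROWS_PER_MONTH = 30_000_000

    def rows_per_distribution(months_per_partition: int) -> int:
        """Rows that land in each distribution of one partition."""
        return ROWS_PER_MONTH * months_per_partition // DISTRIBUTIONS

    for label, months in [("monthly", 1), ("yearly", 12)]:
        rows = rows_per_distribution(months)
        meets_target = rows >= TARGET_ROWS_PER_ROWGROUP
        print(f"{label}: {rows:,} rows per distribution "
              f"-> meets rowgroup target: {meets_target}")
    ```

    Under these assumptions a monthly partition holds only 500,000 rows per distribution, well under the rowgroup target, while a yearly partition holds 6,000,000, which is why coarser partitioning tends to compress better here.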

  • Question 54:

    You have an Azure Databricks workspace that contains a Delta Lake dimension table named Table1.

    Table1 is a Type 2 slowly changing dimension (SCD) table.

    You need to apply updates from a source table to Table1.

    Which Apache Spark SQL operation should you use?

    A. CREATE

    B. UPDATE

    C. MERGE

    D. ALTER
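
    In Delta Lake, Type 2 updates are expressed declaratively with a single MERGE statement (expire the matched current row, insert the new version, insert unmatched source rows). A minimal pure-Python sketch of those semantics, with hypothetical column names (`key`, `attr`, `start_date`, `end_date`, `is_current`) rather than anything from the question:

    ```python
    from datetime import date

    # Pure-Python sketch of the Type 2 SCD semantics that a Delta Lake
    # MERGE expresses declaratively. Column names are hypothetical.
    def scd2_merge(dim_rows, source_rows, today=date(2024, 1, 1)):
        """Close out changed current rows and append new versions."""
        source_by_key = {r["key"]: r for r in source_rows}
        merged = []
        for row in dim_rows:
            if row["is_current"] and row["key"] in source_by_key:
                src = source_by_key[row["key"]]
                if src["attr"] != row["attr"]:
                    # WHEN MATCHED: expire the current version...
                    merged.append(dict(row, is_current=False, end_date=today))
                    # ...and insert the new version.
                    merged.append({"key": row["key"], "attr": src["attr"],
                                   "start_date": today, "end_date": None,
                                   "is_current": True})
                    continue
            merged.append(row)
        # WHEN NOT MATCHED: brand-new keys become new current rows.
        existing = {r["key"] for r in dim_rows}
        for key, src in source_by_key.items():
            if key not in existing:
                merged.append({"key": key, "attr": src["attr"],
                               "start_date": today, "end_date": None,
                               "is_current": True})
        return merged
    ```

    The point of the sketch is that a merge handles matched and unmatched rows in one pass, which is exactly what separate UPDATE or INSERT statements cannot do.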

  • Question 55:

    You have an Azure Data Factory pipeline named Pipeline1. Pipeline1 contains a copy activity that sends data to an Azure Data Lake Storage Gen2 account. Pipeline1 is executed by a schedule trigger.

    You change the copy activity sink to a new storage account and merge the changes into the collaboration branch.

    After Pipeline1 executes, you discover that data is NOT copied to the new storage account.

    You need to ensure that the data is copied to the new storage account.

    What should you do?

    A. Publish from the collaboration branch.

    B. Configure the change feed of the new storage account.

    C. Create a pull request.

    D. Modify the schedule trigger.

  • Question 56:

    What should you recommend to prevent users outside the Litware on-premises network from accessing the analytical data store?

    A. a server-level virtual network rule

    B. a database-level virtual network rule

    C. a server-level firewall IP rule

    D. a database-level firewall IP rule

  • Question 57:

    HOTSPOT

    Which Azure Data Factory components should you recommend using together to import the daily inventory data from the SQL server to Azure Data Lake Storage? To answer, select the appropriate options in the answer area.

    NOTE: Each correct selection is worth one point.

    Hot Area:

  • Question 58:

    HOTSPOT

    You need to design the partitions for the product sales transactions. The solution must meet the sales transaction dataset requirements.

    What should you include in the solution? To answer, select the appropriate options in the answer area.

    NOTE: Each correct selection is worth one point.

    Hot Area:

  • Question 59:

    HOTSPOT

    You have an Azure Data Factory pipeline shown in the following exhibit.

    The execution log for the first pipeline run is shown in the following exhibit.

    The execution log for the second pipeline run is shown in the following exhibit.

    For each of the following statements, select Yes if the statement is true. Otherwise, select No. NOTE: Each correct selection is worth one point.

    Hot Area:

  • Question 60:

    HOTSPOT

    You have an Azure Synapse Analytics dedicated SQL pool named Pool1 that contains an external table named Sales. Sales contains sales data. Each row in Sales contains data on a single sale, including the name of the salesperson.

    You need to implement row-level security (RLS). The solution must ensure that the salespeople can access only their respective sales.

    What should you do? To answer, select the appropriate options in the answer area.

    NOTE: Each correct selection is worth one point.

    Hot Area:
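
    Conceptually, row-level security attaches a filter predicate to the table so that every read keeps only the rows where the salesperson column matches the calling user. A minimal sketch of that filtering behavior, using hypothetical column and user names (the real mechanism is a predicate function bound by a security policy, not application code):

    ```python
    # Simulate what an RLS filter predicate such as
    # "SalesRep = USER_NAME()" does to a query's result set.
    def rls_filter(rows, current_user):
        """Keep only the rows visible to current_user."""
        return [r for r in rows if r["SalesRep"] == current_user]

    sales = [{"SalesRep": "SalesUser1", "Amount": 100},
             {"SalesRep": "SalesUser2", "Amount": 250}]
    print(rls_filter(sales, "SalesUser1"))
    ```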

Tips on How to Prepare for the Exams

Certification exams have become increasingly important and are required by more and more enterprises when applying for a job. But how do you prepare for an exam effectively? How do you prepare in a short time with less effort? How do you achieve an ideal result, and where do you find the most reliable resources? Here on Vcedump.com, you will find all the answers. Vcedump.com provides not only Microsoft exam questions, answers, and explanations but also complete assistance with your exam preparation and certification application. If you are unsure about your DP-203 exam preparation or Microsoft certification application, do not hesitate to visit Vcedump.com to find your solutions.