Designing Microsoft Azure Infrastructure Solutions
Exam Details
Exam Code: AZ-305
Exam Name: Designing Microsoft Azure Infrastructure Solutions
Certification: Microsoft Certifications
Vendor: Microsoft
Total Questions: 378 Q&As
Last Updated: Mar 25, 2025
Microsoft Certifications AZ-305 Questions & Answers
Question 21:
HOTSPOT
You plan to use Azure SQL as a database platform.
You need to recommend an Azure SQL product and service tier that meets the following requirements:
1. Automatically scales compute resources based on the workload demand.
2. Provides per-second billing.
What should you recommend? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
Correct Answer:
Box 1: A single Azure SQL Database
Azure SQL product
Serverless is a compute tier for single databases in Azure SQL Database that automatically scales compute based on workload demand and bills for the amount of compute used per second.
The serverless compute tier for single databases in Azure SQL Database is parameterized by a compute autoscaling range and an auto-pause delay.
Box 2: General Purpose
Service tier
The serverless compute tier is available in the General Purpose service tier and currently in preview in the Hyperscale service tier.
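For illustration, a serverless General Purpose single database can be provisioned with plain T-SQL run against the logical server's master database. This is a minimal sketch; the database name SalesDb and the GP_S_Gen5_1 service objective (Gen5 hardware, 1 max vCore) are example values.

-- Run in the master database of the Azure SQL logical server.
-- The 'GP_S_' prefix = General Purpose tier, Serverless compute.
CREATE DATABASE SalesDb
(
    EDITION = 'GeneralPurpose',
    SERVICE_OBJECTIVE = 'GP_S_Gen5_1'
);

The database then bills compute per second of use and can auto-pause after a configurable delay (the delay is set through the portal, CLI, or ARM rather than T-SQL).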
Question 22:
HOTSPOT
You need to deploy a solution that will provide point-in-time restore for blobs in storage accounts that have blob versioning and blob soft delete enabled.
Which type of blob should you create, and what should you enable for the accounts? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
Correct Answer:
Box 1: Block
Point-in-time restore for block blobs
Point-in-time restore provides protection against accidental deletion or corruption by enabling you to restore block blob data to an earlier state. Point-in-time restore is useful in scenarios where a user or application accidentally deletes data or where an application error corrupts data. Point-in-time restore also enables testing scenarios that require reverting a data set to a known state before running further tests.
Point-in-time restore is supported for general-purpose v2 storage accounts in the standard performance tier only. Only data in the hot and cool access tiers can be restored with point-in-time restore.
Box 2: The change feed
Enable and configure point-in-time restore
Before you enable and configure point-in-time restore, enable its prerequisites for the storage account: soft delete, change feed, and blob versioning.
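As a sketch of how the prerequisites and the restore policy could be enabled together, the following uses the azure-mgmt-storage Python SDK. The subscription, resource group, account name, and retention periods are placeholders, and the model names should be checked against the installed SDK version.

from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import (
    BlobServiceProperties, ChangeFeed, DeleteRetentionPolicy, RestorePolicyProperties)

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Point-in-time restore requires soft delete, change feed, and versioning;
# the restore window (days) must be shorter than the soft-delete retention.
client.blob_services.set_service_properties(
    "<resource-group>", "<account-name>",
    BlobServiceProperties(
        delete_retention_policy=DeleteRetentionPolicy(enabled=True, days=14),
        change_feed=ChangeFeed(enabled=True),
        is_versioning_enabled=True,
        restore_policy=RestorePolicyProperties(enabled=True, days=7),
    ),
)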
Question 23:
HOTSPOT
You are designing a storage solution that will ingest, store, and analyze petabytes (PBs) of structured, semi-structured, and unstructured text data. The analyzed data will be offloaded to Azure Data Lake Storage Gen2 for long-term retention.
You need to recommend a storage and analytics solution that meets the following requirements:
1. Stores the processed data.
2. Provides interactive analytics.
3. Supports manual scaling, built-in autoscaling, and custom autoscaling.
What should you include in the recommendation? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
Correct Answer:
Box 1: Azure Data Lake Analytics
Stores the processed data. Provides interactive analytics.
Azure Data Lake Analytics is an on-demand analytics job service that simplifies big data. Instead of deploying, configuring, and tuning hardware, you write queries to transform your data and extract valuable insights. The analytics service can handle jobs of any scale instantly by setting the dial for how much power you need. You only pay for your job when it's running, making it cost-effective.
U-SQL: simple and familiar, powerful, and extensible
Data Lake Analytics includes U-SQL, a query language that extends the familiar, simple, declarative nature of SQL with the expressive power of C#. The U-SQL language uses the same distributed runtime that powers Microsoft's internal exabyte-scale data lake. SQL and .NET developers can now process and analyze their data with the skills they already have.
Box 2: U-SQL
Get started with U-SQL in Azure Data Lake Analytics
U-SQL is a language that combines declarative SQL with imperative C# to let you process data at any scale. Through the scalable, distributed-query capability of U-SQL, you can efficiently analyze data across relational stores such as Azure SQL Database. With U-SQL, you can process unstructured data by applying schema on read and inserting custom logic and UDFs. Additionally, U-SQL includes extensibility that gives you fine-grained control over how to execute at scale.
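A minimal U-SQL script follows the extract-transform-output pattern. The input path and schema below are hypothetical, shown only to illustrate the SQL-plus-C# style (note the C# == comparison):

// Read a tab-separated log from the data lake.
@searchlog =
    EXTRACT UserId int,
            Start DateTime,
            Region string,
            Query string
    FROM "/Samples/Data/SearchLog.tsv"
    USING Extractors.Tsv();

// Filter rows using C# expression semantics.
@filtered =
    SELECT UserId, Region, Query
    FROM @searchlog
    WHERE Region == "en-gb";

OUTPUT @filtered
    TO "/output/SearchLog-filtered.csv"
    USING Outputters.Csv();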
Question 24:
HOTSPOT
You have an Azure subscription. The subscription contains an Azure SQL managed instance that stores employee details, including social security numbers and phone numbers.
You need to configure the managed instance to meet the following requirements:
1. The helpdesk team must see only the last four digits of an employee's phone number.
2. Cloud administrators must be prevented from seeing the employees' social security numbers.
What should you enable for each column in the managed instance? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
Correct Answer:
Box 1: Dynamic data masking
The helpdesk team must see only the last four digits of an employee's phone number.
Dynamic data masking helps prevent unauthorized access to sensitive data by enabling customers to designate how much of the sensitive data to reveal with minimal effect on the application layer. It's a policy-based security feature that hides
the sensitive data in the result set of a query over designated database fields, while the data in the database isn't changed.
Masking functions: A set of methods that control the exposure of data for different scenarios.
* Credit card: a masking method that exposes the last four digits of the designated fields and adds a constant string as a prefix in the form of a credit card, for example: XXXX-XXXX-XXXX-1234
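For the phone number column, a partial() mask can expose only the last four digits. A sketch with hypothetical table and column names:

-- Users without the UNMASK permission (the helpdesk team) see XXX-XXX-1234.
ALTER TABLE dbo.Employees
ALTER COLUMN PhoneNumber ADD MASKED WITH (FUNCTION = 'partial(0,"XXX-XXX-",4)');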
Box 2: Always Encrypted
Cloud administrators must be prevented from seeing the employee's social security numbers.
Always Encrypted is a feature designed to protect sensitive data, such as credit card numbers or national/regional identification numbers (for example, U.S. social security numbers), stored in Azure SQL Database, Azure SQL Managed
Instance, and SQL Server databases. Always Encrypted allows clients to encrypt sensitive data inside client applications and never reveal the encryption keys to the Database Engine. This provides a separation between those who own the
data and can view it, and those who manage the data but should have no access: on-premises database administrators, cloud database operators, or other high-privileged unauthorized users. As a result, Always Encrypted enables
customers to confidently store their sensitive data in the cloud, and to reduce the likelihood of data theft by malicious insiders.
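Declaring an Always Encrypted column uses the ENCRYPTED WITH clause. The sketch below assumes a column encryption key named CEK1 has already been provisioned (for example, with its master key in Azure Key Vault); deterministic encryption also requires a BIN2 collation on character columns.

CREATE TABLE dbo.Employees
(
    EmployeeId int IDENTITY PRIMARY KEY,
    SSN char(11) COLLATE Latin1_General_BIN2
        ENCRYPTED WITH (COLUMN_ENCRYPTION_KEY = CEK1,
                        ENCRYPTION_TYPE = DETERMINISTIC,
                        ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256') NOT NULL
);

Because encryption and decryption happen in the client driver, cloud administrators querying the database see only ciphertext.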
Question 25:
HOTSPOT
You are designing a data analytics solution that will use Azure Synapse and Azure Data Lake Storage Gen2.
You need to recommend Azure Synapse pools to meet the following requirements:
1. Ingest data from Data Lake Storage into hash-distributed tables.
2. Query and update data in Delta Lake.
What should you recommend for each requirement? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
Correct Answer:
Box 1: A dedicated SQL pool
Ingest data from Data Lake Storage into hash-distributed tables.
Guidance for designing distributed tables using dedicated SQL pool in Azure Synapse Analytics
You can design hash-distributed and round-robin distributed tables in dedicated SQL pools.
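A sketch of a hash-distributed table in a dedicated SQL pool, with a COPY INTO statement ingesting from Data Lake Storage; the table, column, and path names are hypothetical:

CREATE TABLE dbo.FactSales
(
    SaleId bigint NOT NULL,
    CustomerKey int NOT NULL,
    Amount decimal(18, 2) NOT NULL
)
WITH
(
    DISTRIBUTION = HASH(CustomerKey),  -- rows assigned to distributions by this key
    CLUSTERED COLUMNSTORE INDEX
);

COPY INTO dbo.FactSales
FROM 'https://<account>.dfs.core.windows.net/<container>/sales/*.parquet'
WITH (FILE_TYPE = 'PARQUET');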
Box 2: A serverless SQL pool
Query and update data in Delta Lake.
You can query Delta Lake files using a serverless SQL pool in Azure Synapse Analytics.
You can write a query using a serverless Synapse SQL pool to read Delta Lake files. Delta Lake is an open-source storage layer that brings ACID (atomicity, consistency, isolation, and durability) transactions to Apache Spark and big data workloads.
The serverless SQL pool in a Synapse workspace enables you to read data stored in Delta Lake format and serve it to reporting tools. A serverless SQL pool can read Delta Lake files that are created using Apache Spark, Azure Databricks, or any other producer of the Delta Lake format.
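A serverless SQL pool reads a Delta Lake folder through OPENROWSET with FORMAT = 'DELTA'; the storage path below is a placeholder:

SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://<account>.dfs.core.windows.net/<container>/delta/sales/',
    FORMAT = 'DELTA'
) AS rows;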
Question 26:
HOTSPOT
You have an Azure subscription. The subscription contains 100 virtual machines that run Windows Server 2022 and have the Azure Monitor Agent installed.
You need to recommend a solution that meets the following requirements:
Forwards JSON-formatted logs from the virtual machines to a Log Analytics workspace
Transforms the logs and stores the data in a table in the Log Analytics workspace
What should you include in the recommendation? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
Correct Answer:
Box 1: An Azure Monitor data collection endpoint
Forwards JSON-formatted logs from the virtual machines to a Log Analytics workspace
Data collection endpoints (DCEs) provide a connection for certain data sources of Azure Monitor.
Data sources that use DCEs
The following data sources currently use DCEs:
Azure Monitor Agent when network isolation is required
Logs ingestion API
Logs Ingestion API in Azure Monitor
The Logs Ingestion API in Azure Monitor lets you send data to a Log Analytics workspace using either a REST API call or client libraries. By using this API, you can send data to supported Azure tables or to custom tables that you create. You
can even extend the schema of Azure tables with custom columns to accept additional data.
Basic operation
Your application sends data to a data collection endpoint (DCE), which is a unique connection point for your subscription. The payload of your API call includes the source data formatted in JSON. The call:
Specifies a data collection rule (DCR) that understands the format of the source data.
Potentially filters and transforms the data for the target table.
Directs the data to a specific table in a specific workspace.
You can modify the target table and workspace by modifying the DCR without any change to the API call or source data.
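A minimal sketch of such an API call using the azure-monitor-ingestion Python client; the DCE URI, the DCR immutable ID, the stream name, and the log fields are all placeholders:

from azure.identity import DefaultAzureCredential
from azure.monitor.ingestion import LogsIngestionClient

# The endpoint is the logs-ingestion URI of the data collection endpoint (DCE).
client = LogsIngestionClient(
    endpoint="https://<dce-name>.<region>.ingest.monitor.azure.com",
    credential=DefaultAzureCredential(),
)

# rule_id is the immutable ID of the DCR; stream_name names the
# input stream declared in the DCR, which maps it to the target table.
client.upload(
    rule_id="dcr-<immutable-id>",
    stream_name="Custom-App1Logs_CL",
    logs=[{"TimeGenerated": "2025-03-25T00:00:00Z", "RawData": "{ \"level\": \"error\" }"}],
)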
Incorrect:
* A linked storage account for the Log Analytics workspace: linked storage accounts provide customer-managed storage for workspace artifacts such as saved queries; they do not provide an ingestion path for agent data.
Box 2: A KQL query
Transforms the logs and stores the data in a table in the Log Analytics workspace
Transformations in Azure Monitor allow you to filter or modify incoming data before it's stored in a Log Analytics workspace. They are implemented as a Kusto Query Language (KQL) statement in a data collection rule (DCR).
Transformation structure
The KQL statement is applied individually to each entry in the data source. It must understand the format of the incoming data and create output in the structure of the target table. The input stream is represented by a virtual table named
source with columns matching the input data stream definition. Following is a typical example of a transformation. This example includes the following functionality:
Filters the incoming data with a where statement
Adds a new column using the extend operator
Formats the output to match the columns of the target table using the project operator
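The sample itself is not reproduced above; a minimal sketch of such a KQL transformation, assuming hypothetical input columns severity, RawData, and TimeGenerated, could be:

source
| where severity == "Critical"                            // filter the incoming data
| extend ErrorCode = tostring(parse_json(RawData).code)   // add a new column
| project TimeGenerated, severity, ErrorCode              // shape to the target table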
Question 27:
HOTSPOT
You plan to deploy five storage accounts that will store block blobs and five storage accounts that will host file shares. The file shares will be accessed by using the SMB protocol.
You need to recommend an access authorization solution for the storage accounts. The solution must meet the following requirements:
1. Maximize security.
2. Prevent the use of shared keys.
3. Whenever possible, support time-limited access.
What should you include in the solution? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
Correct Answer:
Box 1: A shared access signature (SAS) and a stored access policy
Blobs
See the note below.
User delegation SAS
A user delegation SAS is secured with Azure Active Directory (Azure AD) credentials and also by the permissions specified for the SAS. A user delegation SAS applies to Blob storage only.
A shared access signature can take one of the following two forms:
Ad hoc SAS. When you create an ad hoc SAS, the start time, expiry time, and permissions are specified in the SAS URI. Any type of SAS can be an ad hoc SAS.
Service SAS with stored access policy. A stored access policy is defined on a resource container, which can be a blob container, table, queue, or file share. The stored access policy can be used to manage constraints for one or more service
shared access signatures. When you associate a service SAS with a stored access policy, the SAS inherits the constraints (the start time, expiry time, and permissions) defined for the stored access policy.
Note
A user delegation SAS or an account SAS must be an ad hoc SAS. Stored access policies are not supported for the user delegation SAS or the account SAS.
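A user delegation SAS can be issued with the azure-storage-blob Python client, signed by Azure AD credentials rather than a shared key; the account, container, and blob names are placeholders:

from datetime import datetime, timedelta, timezone
from azure.identity import DefaultAzureCredential
from azure.storage.blob import (
    BlobSasPermissions, BlobServiceClient, generate_blob_sas)

service = BlobServiceClient(
    "https://<account-name>.blob.core.windows.net",
    credential=DefaultAzureCredential(),
)

# The delegation key is obtained with Azure AD credentials; no account key is used.
start = datetime.now(timezone.utc)
expiry = start + timedelta(hours=1)  # time-limited access
key = service.get_user_delegation_key(start, expiry)

sas = generate_blob_sas(
    account_name="<account-name>",
    container_name="<container>",
    blob_name="<blob>",
    user_delegation_key=key,
    permission=BlobSasPermissions(read=True),
    expiry=expiry,
)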
Box 2: Azure AD credentials
File shares
A user delegation SAS is secured with Azure Active Directory (Azure AD) credentials but applies to Blob storage only, so it is not an option for the file shares. For SMB file shares, identity-based Azure AD authentication avoids shared keys entirely.
Note: A shared access signature (SAS) provides secure delegated access to resources in your storage account. With a SAS, you have granular control over how a client can access your data. For example:
What resources the client may access.
What permissions they have to those resources.
How long the SAS is valid.
Types of shared access signatures
Azure Storage supports three types of shared access signatures: a user delegation SAS, a service SAS, and an account SAS.
Question 28:
HOTSPOT
You have an Azure subscription that contains the resources shown in the following table:
Log files from App1 are ingested into App1Logs. An average of 120 GB of log data is ingested per day.
You configure an Azure Monitor alert that will be triggered if the App1 logs contain error messages.
You need to minimize the Log Analytics costs associated with App1. The solution must meet the following requirements:
Ensure that all the log files from App1 are ingested to App1Logs.
Minimize the impact on the Azure Monitor alert.
Which resource should you modify, and which modification should you perform? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
Correct Answer:
Box 1: Workspace1
Resource
Box 2: Change to a commitment pricing tier
Modification
Commitment tiers
In addition to the pay-as-you-go model, Log Analytics has commitment tiers, which can save you as much as 30 percent compared to the pay-as-you-go price. With commitment tier pricing, you commit to buying data ingestion for a workspace, starting at 100 GB per day, at a lower price than pay-as-you-go pricing. Any usage above the commitment level (overage) is billed at the same per-GB price as the current commitment tier.
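As a rough, order-of-magnitude illustration of why a commitment tier suits the 120-GB/day workload, the sketch below compares the two models; the prices are placeholders, not current Azure list prices:

# Placeholder example rates; consult the Azure Monitor pricing page for real ones.
PAYG_PER_GB = 2.30           # $/GB, pay-as-you-go
TIER_100GB_PER_DAY = 196.00  # $/day for a 100 GB/day commitment
OVERAGE_PER_GB = TIER_100GB_PER_DAY / 100  # overage billed at the tier's effective rate

ingested = 120  # GB per day
payg = ingested * PAYG_PER_GB
commitment = TIER_100GB_PER_DAY + (ingested - 100) * OVERAGE_PER_GB
print(f"pay-as-you-go ${payg:.2f}/day vs commitment ${commitment:.2f}/day")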
Incorrect:
* Change to the Basic Logs data plan.
Would not support alerts.
Note: Azure Monitor Logs offers two log data plans that let you reduce log ingestion and retention costs and take advantage of Azure Monitor's advanced features and analytics capabilities based on your needs:
The default Analytics log data plan provides full analysis capabilities and makes log data available for queries, Azure Monitor features, such as alerts, and use by other services.
The Basic log data plan lets you save on the cost of ingesting and storing high-volume verbose logs in your Log Analytics workspace for debugging, troubleshooting, and auditing, but not for analytics and alerts.
* Set a daily cap
A daily cap would not guarantee that all log files are ingested.
Set daily cap on Log Analytics workspace
A daily cap on a Log Analytics workspace allows you to avoid unexpected increases in charges for data ingestion by stopping collection of billable data for the rest of the day whenever a specified threshold is reached.
Question 29:
HOTSPOT
You have an Azure subscription that contains 50 Azure SQL databases.
You create an Azure Resource Manager (ARM) template named Template1 that enables Transparent Data Encryption (TDE).
You need to create an Azure Policy definition named Policy1 that will use Template1 to enable TDE for any noncompliant Azure SQL databases.
How should you configure Policy1? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
Correct Answer:
Box 1: DeployIfNotExists
Similar to AuditIfNotExists, a DeployIfNotExists policy definition executes a template deployment when the condition is met.
DeployIfNotExists evaluation
DeployIfNotExists runs after a configurable delay when a Resource Provider handles a create or update subscription or resource request and has returned a success status code. A template deployment occurs if there are no related resources or if the resources defined by ExistenceCondition don't evaluate to true. The duration of the deployment depends on the complexity of resources included in the template.
During an evaluation cycle, policy definitions with a DeployIfNotExists effect that match resources are marked as non-compliant, but no action is taken on that resource.
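A trimmed, hypothetical skeleton of such a policy rule (the existence condition and role ID are illustrative, and the contents of Template1 would be embedded under deployment.properties.template):

{
  "if": { "field": "type", "equals": "Microsoft.Sql/servers/databases" },
  "then": {
    "effect": "deployIfNotExists",
    "details": {
      "type": "Microsoft.Sql/servers/databases/transparentDataEncryption",
      "existenceCondition": {
        "field": "Microsoft.Sql/transparentDataEncryption.status",
        "equals": "Enabled"
      },
      "roleDefinitionIds": [ "/providers/Microsoft.Authorization/roleDefinitions/<role-guid>" ],
      "deployment": {
        "properties": { "mode": "incremental", "template": { } }
      }
    }
  }
}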
Incorrect:
* EnforceRegoPolicy: no such effect exists in this context.
* Modify: used to add, update, or remove properties or tags on a subscription or resource during creation or update. A common example is updating tags on resources such as costCenter.
Modify evaluation
Modify evaluates before the request gets processed by a Resource Provider during the creation or updating of a resource. The Modify operations are applied to the request content when the if condition of the policy rule is met. Each Modify
operation can specify a condition that determines when it's applied. Operations with false condition evaluations are skipped.
Box 2: The identity required to perform the remediation task
Policy assignments with effect set as DeployIfNotExists require a managed identity to do remediation.
Note: Each policy definition in Azure Policy has a single effect. That effect determines what happens when the policy rule is evaluated to match. The effects behave differently if they are for a new resource, an updated resource, or an existing
resource.
These effects are currently supported in a policy definition: Append, Audit, AuditIfNotExists, Deny, DenyAction, DeployIfNotExists, Disabled, Manual, and Modify.