NTO has 1 million customer records spanning 25 years. As part of its new Salesforce project, NTO would like to create a master data management strategy to help preserve the history and relevance of its customer data.
Which 3 activities will be required to identify a successful master data management strategy? Choose 3 answers:
A. Identify data to be replicated
B. Create a data archive strategy
C. Define the systems of record for critical data
D. Install a data warehouse
E. Choose a Business Intelligence tool.
Correct Answer: ABC
Explanation: The three activities that will be required to identify a successful master data management strategy are:
Identify data to be replicated: This activity involves determining which data elements need to be copied from one system to another, and how frequently the replication should occur. This can help ensure data consistency and availability
across systems.
Create a data archive strategy: This activity involves defining how historical data will be stored, accessed, and deleted over time. This can help optimize data storage, performance, and compliance.
Define the systems of record for critical data: This activity involves identifying which system owns and maintains the authoritative version of each data element. This can help avoid data conflicts and duplication across systems.
Installing a data warehouse is not a required activity, but rather a possible option for consolidating data from multiple sources for analytics purposes. Choosing a Business Intelligence tool is not a required activity, but rather a possible option for visualizing and reporting on data from various sources.
Question 202:
Northern Trail Outfitters has implemented Salesforce for its associates nationwide. Senior management is concerned that the executive dashboard is not reliable for their real-time decision-making. On analysis, the team found the following issues with data entered in Salesforce:
Information in certain records is incomplete.
Incorrect entries in certain fields cause records to be excluded from report filters.
Duplicate entries cause incorrect counts.
Which three steps should a data architect recommend to address the issues?
A. Periodically export data, cleanse it, and import it back into Salesforce for executive reports.
B. Build a sales data warehouse with purpose-built data marts for dashboards and senior management reporting.
C. Explore third-party data providers to enrich and augment information entered in Salesforce.
D. Leverage Salesforce features, such as validation rules, to avoid incomplete and incorrect records.
E. Design and implement a data-quality dashboard to monitor and act on records that are incomplete or incorrect.
Correct Answer: BCD
Explanation: According to the Salesforce documentation, data quality is the measure of how well the data in Salesforce meets the expectations and requirements of the users and stakeholders. Data quality can be affected by factors such as data entry errors, data duplication, data inconsistency, data incompleteness, and data timeliness. To address the data quality issues that affect the reliability of executive dashboards, a data architect should recommend:
Building a sales data warehouse with purpose-built data marts for dashboards and senior management reporting (option B). This means creating a separate database or system that stores and organizes sales data from Salesforce and other sources for analytical purposes. A data warehouse can provide a single source of truth for sales data and enable faster and more accurate reporting and analysis. A data mart is a subset of a data warehouse that focuses on a specific subject or business area, such as sales performance, customer segmentation, or product profitability, and can provide tailored, relevant data for different users or groups based on their needs.
Exploring third-party data providers to enrich and augment information entered in Salesforce (option C). This means using external services or tools that can validate, correct, update, and enhance the data that is entered or imported into Salesforce, which improves data quality and accuracy and reduces duplication and incompleteness.
Leveraging Salesforce features, such as validation rules, to avoid incomplete and incorrect records (option D). Validation rules define criteria that data must meet before it can be saved, which prevents invalid or incorrect records from being stored and prompts users to correct the data.
Periodically exporting data to cleanse it and importing it back into Salesforce for executive reports (option A) is not a good solution, as it is time-consuming, error-prone, and inefficient, and may cause data inconsistency and synchronization issues between Salesforce and other systems.
Designing and implementing a data-quality dashboard to monitor and act on incomplete or incorrect records (option E) is also not a good solution, as it can be complex, costly, and difficult to maintain, and it does not address the root causes of data quality issues or prevent them from occurring in the first place.
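For illustration only, the following Python sketch (assuming the simple_salesforce library, placeholder credentials, and that Industry is one of the fields the business treats as required on Account) shows how the reported issues, incomplete records and duplicates, could be quantified before remediation; it is an assumption-laden example, not part of the recommended answer.

```python
# Sketch: quantify incomplete and duplicate Account records before remediation.
# Assumptions: simple_salesforce is installed, the credentials below are
# placeholders, and "Industry" is one of the fields the business requires.
from collections import Counter

from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com",      # placeholder credentials
                password="password",
                security_token="token")

# Records with a missing required field (incomplete data).
incomplete = sf.query("SELECT COUNT() FROM Account WHERE Industry = null")
print("Accounts missing Industry:", incomplete["totalSize"])

# Potential duplicates: the same Name appearing more than once.
# (Suitable only for small data volumes; a real job would aggregate server-side.)
result = sf.query_all("SELECT Name FROM Account")
name_counts = Counter(rec["Name"] for rec in result["records"])
duplicates = {name: n for name, n in name_counts.items() if n > 1}
print("Duplicate Account names:", len(duplicates))
```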
Question 203:
Universal Containers is experiencing frequent and persistent group membership locking issues that severely restrict its ability to manage manual and automated updates at the same time.
What should a data architect do in order to resolve the issue?
A. Enable granular locking
B. Enable parallel sharing rule calculation.
C. Enable defer sharing calculation
D. Enable implicit sharing
Correct Answer: A
Explanation: Enabling granular locking allows concurrent sharing rule calculations and group membership updates to run without locking each other. This can help resolve the group membership locking issues that UC is experiencing.
Question 204:
Northern Trail Outfitters is streaming IoT data from connected devices to a cloud database. Every 24 hours, 100,000 records are generated.
NTO employees will need to see these IoT records within Salesforce and generate weekly reports on them. Developers may also need to write programmatic logic to aggregate the records and incorporate them into workflows.
Which data pattern will allow a data architect to satisfy these requirements, while also keeping limits in mind?
A. Bidirectional integration
B. Unidirectional integration
C. Virtualization
D. Persistence
Correct Answer: D
Explanation: Persistence is the data pattern that will allow a data architect to satisfy the requirements while keeping limits in mind. Persistence means storing data from external sources in Salesforce objects, either standard or custom. This allows you to access the data within Salesforce and use it for reporting, analytics, workflows, and other features. Persistence also helps you avoid hitting API limits or performance issues when accessing large volumes of data from external systems. You can use tools such as Data Loader, the Bulk API, or Platform Events to persist the IoT data from the cloud database into Salesforce.
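As a minimal sketch of the persistence pattern, and assuming the simple_salesforce library plus a hypothetical custom object Device_Reading__c with Device_Id__c and Value__c fields (none of which appear in the question), a daily batch of IoT readings might be loaded like this:

```python
# Sketch: persist a daily batch of IoT readings into a custom object.
# Assumptions: simple_salesforce is installed, the credentials are placeholders,
# and a custom object Device_Reading__c with Device_Id__c and Value__c exists.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com",      # placeholder credentials
                password="password",
                security_token="token")

# In practice these rows would come from the cloud database holding the IoT feed.
readings = [
    {"Device_Id__c": "sensor-001", "Value__c": 21.4},
    {"Device_Id__c": "sensor-002", "Value__c": 19.8},
]

# The Bulk API handles large daily volumes (e.g., ~100,000 records) efficiently.
results = sf.bulk.Device_Reading__c.insert(readings, batch_size=10000)
print(results)  # one success/error entry per record
```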
Question 205:
An Architect needs to document the data architecture for a multi-system, enterprise Salesforce implementation.
Which two key artifacts should the Architect use? (Choose two.)
A. User stories
B. Data model
C. Integration specification
D. Non-functional requirements
Correct Answer: BC
Explanation: Option B is correct because a data model is a key artifact that an architect should use to document the data architecture for a multi-system, enterprise Salesforce implementation. A data model describes the structure and relationships of data entities within an organization. Option C is correct because an integration specification is another key artifact; it defines the scope, requirements, design, testing, and deployment of integration solutions between Salesforce and other systems. Option A is not correct because user stories are not key artifacts for documenting the data architecture, but agile development tools that capture the features and functionality that users want from a system. Option D is not correct because non-functional requirements are not key artifacts for documenting the data architecture, but quality attributes that specify how well a system performs its functions.
Question 206:
Which two data management policies does the data classification feature allow customers to classify data by in Salesforce? Choose 2 answers:
A. Reference data policy.
B. Data governance policy.
C. Data sensitivity level
D. Compliance categorization policy.
Correct Answer: CD
Explanation: The data classification feature allows customers to classify their data in Salesforce based on two policies:
Data sensitivity level: This policy defines how sensitive the data is and what level of protection it requires. For example, high sensitivity data may require encryption or masking.
Compliance categorization policy: This policy defines how the data is regulated by various laws and standards. For example, GDPR or PCI DSS.
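For reference, these classifications are stored as field-level metadata. The hedged Python sketch below (assuming the simple_salesforce library, placeholder credentials, and that the org's API version exposes SecurityClassification and ComplianceGroup on FieldDefinition) reads them for Account fields.

```python
# Sketch: read data classification metadata (sensitivity level and compliance
# categorization) recorded on Account fields.
# Assumptions: simple_salesforce is installed, credentials are placeholders, and
# the org's API version exposes these attributes on FieldDefinition.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com",      # placeholder credentials
                password="password",
                security_token="token")

soql = (
    "SELECT QualifiedApiName, SecurityClassification, ComplianceGroup "
    "FROM FieldDefinition "
    "WHERE EntityDefinition.QualifiedApiName = 'Account'"
)
for field in sf.query_all(soql)["records"]:
    print(field["QualifiedApiName"],
          field.get("SecurityClassification"),
          field.get("ComplianceGroup"))
```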
Question 207:
DreamHouse Realty has 15 million records in the Order__c custom object. When running a bulk query, the query times out.
What should be considered to address this issue?
A. Tooling API
B. PK Chunking
C. Metadata API
D. Streaming API
Correct Answer: B
Explanation: PK Chunking is a feature of the Bulk API that allows splitting a large query into smaller batches based on the primary key of the object. This can improve performance and avoid query timeouts when querying large data sets.
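As a rough illustration, and assuming a session ID and instance URL have already been obtained (the API version, object name, and chunk size below are illustrative), a Bulk API v1 query job with PK chunking enabled could be created like this in Python:

```python
# Sketch: create a Bulk API (v1) query job with PK chunking enabled, so a query
# against ~15 million Order__c rows is split into primary-key-ranged batches.
# Assumptions: the session ID and instance URL are placeholders obtained
# beforehand (e.g., via OAuth); the API version and chunk size are illustrative.
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"   # placeholder
SESSION_ID = "PLACEHOLDER_SESSION_ID"                      # placeholder

job_xml = """<?xml version="1.0" encoding="UTF-8"?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
  <operation>query</operation>
  <object>Order__c</object>
  <contentType>CSV</contentType>
</jobInfo>"""

response = requests.post(
    f"{INSTANCE_URL}/services/async/57.0/job",
    headers={
        "X-SFDC-Session": SESSION_ID,
        "Content-Type": "application/xml; charset=UTF-8",
        # Ask the Bulk API to split the query into 100,000-record PK ranges.
        "Sforce-Enable-PKChunking": "chunkSize=100000",
    },
    data=job_xml,
)
response.raise_for_status()
print(response.text)  # job info, including the job ID used to add the query batch
```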
Question 208:
Which two aspects of data does an Enterprise data governance program aim to improve?
A. Data integrity
B. Data distribution
C. Data usability
D. Data modeling
Correct Answer: AC
Explanation: Data integrity and data usability are two aspects of data that an enterprise data governance program aims to improve. Data integrity refers to the accuracy, consistency, and validity of the data across the enterprise. Data usability refers to the ease of access, analysis, and interpretation of the data by the end users.
Question 209:
Universal Containers (UC) stores 10 million rows of inventory data in a cloud database. As part of creating a connected experience in Salesforce, UC would like to expose this inventory data in Sales Cloud without an import. UC has asked its data architect to determine if Salesforce Connect is needed.
Which three considerations should the data architect make when evaluating the need for Salesforce Connect?
A. You want real-time access to the latest data, from other systems.
B. You have a large amount of data and would like to copy subsets of it into Salesforce.
C. You need to expose data via a virtual private connection.
D. You have a large amount of data that you don't want to copy into your Salesforce org.
E. You need to access small amounts of external data at any one time.
Correct Answer: ADE
Explanation: The correct answer is A, D, and E. The data architect should consider these three factors when evaluating the need for Salesforce Connect: you want real-time access to the latest data from other systems; you have a large amount of data that you don't want to copy into your Salesforce org; and you need to access only small amounts of external data at any one time. These factors indicate that Salesforce Connect is a suitable solution for creating a connected experience in Salesforce without importing inventory data from a cloud database. Salesforce Connect allows Salesforce to access external data via OData or custom adapters without storing it in Salesforce, which reduces storage costs and ensures data freshness. Salesforce Connect also supports pagination and caching to optimize performance when accessing small amounts of external data at any one time. Option B is incorrect because if you have a large amount of data and would like to copy subsets of it into Salesforce, you may not need Salesforce Connect but rather other tools such as Data Loader or an API integration. Option C is incorrect because if you need to expose data via a virtual private connection, you may not need Salesforce Connect but rather other technologies such as a VPN or VPC peering.
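For context, once Salesforce Connect maps the external source to an external object, the virtualized rows can be queried like any other object. The Python sketch below assumes the simple_salesforce library, placeholder credentials, and a hypothetical external object Inventory__x with a Quantity__c field created through an OData adapter.

```python
# Sketch: read virtualized inventory data through a Salesforce Connect
# external object (no rows are copied into Salesforce storage).
# Assumptions: simple_salesforce is installed, credentials are placeholders, and
# an external object Inventory__x with a Quantity__c field has been defined
# via an OData adapter.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com",      # placeholder credentials
                password="password",
                security_token="token")

# Each call fetches a small page of external data on demand, rather than
# persisting all 10 million rows in the org.
result = sf.query("SELECT ExternalId, Quantity__c FROM Inventory__x LIMIT 200")
for row in result["records"]:
    print(row["ExternalId"], row["Quantity__c"])
```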
Question 210:
A Salesforce customer has plenty of data storage. Sales Reps are complaining that searches are bringing back old records that aren't relevant any longer. Sales Managers need the data for their historical reporting. What strategy should a data architect use to ensure a better user experience for the Sales Reps?
A. Create a Permission Set to hide old data from Sales Reps.
B. Use Batch Apex to archive old data on a rolling nightly basis.
C. Archive and purge old data from Salesforce on a monthly basis.
D. Set data access to Private to hide old data from Sales Reps.
Correct Answer: C
Explanation: Archiving and purging old data from Salesforce on a monthly basis is a good strategy to improve the user experience for the Sales Reps, as it will reduce clutter and improve search performance while the archived data remains available for the Sales Managers' historical reporting. Creating a permission set or setting data access to Private are not effective ways to hide old data from Sales Reps, as the records will still consume data storage and affect search results. Using Batch Apex to archive old data on a rolling nightly basis is also not a good option, as it consumes asynchronous processing limits and processing time, and may not align with the data retention policy.
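A minimal sketch of such a monthly archive-and-purge job is shown below, assuming the simple_salesforce library, placeholder credentials, a local CSV as the archive target (a real implementation would archive to a data warehouse or external store per the retention policy), and an illustrative staleness rule of no modifications in five years.

```python
# Sketch: monthly archive-and-purge of stale Account records.
# Assumptions: simple_salesforce is installed, credentials are placeholders, the
# archive target is a local CSV for illustration, and "not modified in 5+ years"
# is the illustrative staleness rule.
import csv

from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com",      # placeholder credentials
                password="password",
                security_token="token")

stale = sf.query_all(
    "SELECT Id, Name, LastModifiedDate FROM Account "
    "WHERE LastModifiedDate < LAST_N_YEARS:5"
)["records"]

# 1) Archive: write the stale rows somewhere Sales Managers can still report on.
with open("account_archive.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=["Id", "Name", "LastModifiedDate"])
    writer.writeheader()
    for rec in stale:
        writer.writerow({k: rec[k] for k in ("Id", "Name", "LastModifiedDate")})

# 2) Purge: remove the archived rows so searches only return relevant records.
if stale:
    sf.bulk.Account.delete([{"Id": rec["Id"]} for rec in stale], batch_size=10000)
```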