Universal Containers has implemented Sales Cloud to manage patient and related health records. During a recent security audit of the system, it was discovered that some standard and custom fields need to be encrypted.
Which solution should a data architect recommend to encrypt existing fields?
A. Use the Apex Crypto class to encrypt custom and standard fields.
B. Implement Classic Encryption to encrypt custom and standard fields.
C. Implement Shield Platform Encryption to encrypt custom and standard fields.
D. Export data out of Salesforce and encrypt custom and standard fields.
Correct Answer: C
Explanation: The correct answer is C: implement Shield Platform Encryption to encrypt standard and custom fields. Shield Platform Encryption encrypts sensitive data at rest in Salesforce without sacrificing functionality, and it supports both standard and custom fields. The alternatives fall short: the Apex Crypto class requires custom code and application-managed keys, Classic Encryption only offers a special encrypted custom text field type and cannot encrypt standard fields, and exporting data out of Salesforce compromises data security.
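For contrast, here is a minimal sketch of what option A would entail (illustrative only, not a recommendation): the Apex Crypto class leaves key generation, storage, and every encrypt/decrypt call to custom code.

```apex
// Sketch of option A: field-level encryption with the Apex Crypto class.
// The AES key must be generated and safeguarded by the org itself --
// the platform handles none of this, which is why Shield is preferred.
Blob aesKey = Crypto.generateAesKey(256);   // must be stored securely, e.g. in a protected custom setting
Blob clearText = Blob.valueOf('sensitive value');
Blob cipherText = Crypto.encryptWithManagedIV('AES256', aesKey, clearText);
Blob decrypted = Crypto.decryptWithManagedIV('AES256', aesKey, cipherText);
System.assertEquals('sensitive value', decrypted.toString());
```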
Question 192:
UC has the following systems:
Billing system.
Customer support system.
CRM system.
UC has been having trouble with business intelligence across the different systems.
Recently, UC implemented a master data management (MDM) solution that will be the system of truth for customer records.
Which MDM data element is needed to allow reporting across these systems?
A. Global unique customer number.
B. Email address.
C. Phone number.
D. Full name.
Correct Answer: A
Explanation: The correct answer is A, global unique customer number. A global unique customer number is a data element that can uniquely identify each customer across different systems. It can be used as a key to link customer records from different sources and enable reporting across these systems. Email address, phone number, and full name are not reliable or consistent identifiers for customers, as they can change over time or be shared by multiple customers.
Question 193:
Universal Containers has defined a new Data Quality Plan for their Salesforce data and wants to know how they can enforce it throughout the organization. Which two approaches should an architect recommend to enforce this new plan?
Choose 2 answers
A. Schedule a weekly dashboard displaying records that are missing information to be sent to managers for review.
B. Use Workflow, Validation Rules, and Force.com code (Apex) to enforce critical business processes.
C. Schedule reports that will automatically catch duplicates and merge or delete the records every week.
D. Store all data in an external system and set up an integration to Salesforce for view-only access.
Correct Answer: AB
Explanation: Scheduling a weekly dashboard that displays records with missing information for manager review, and using Workflow, Validation Rules, and Force.com code (Apex) to enforce critical business processes, are the two approaches an architect should recommend. The dashboard provides a regular, visual way to monitor data quality and surface gaps for managers or users to address. Workflow, Validation Rules, and Apex enforce data quality standards and business logic by automating actions, displaying error messages, or executing custom code when users create or edit records. The other options are not suitable: they would either not provide real-time feedback, not prevent data quality issues, or not leverage the capabilities of Salesforce.
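For example, here is a minimal sketch of the Apex half of option B, assuming a hypothetical quality rule that every Opportunity past Prospecting must carry a Next Step:

```apex
// Sketch: enforcing a data quality rule in a trigger (option B).
// The rule itself is hypothetical.
trigger OpportunityDataQuality on Opportunity (before insert, before update) {
    for (Opportunity opp : Trigger.new) {
        if (opp.StageName != 'Prospecting' && String.isBlank(opp.NextStep)) {
            // addError blocks the save and shows the message to the user.
            opp.addError('Next Step is required once an opportunity moves past Prospecting.');
        }
    }
}
```

The same rule could be expressed declaratively as a Validation Rule, which is usually the first choice; Apex is reserved for logic a formula cannot express.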
Question 194:
Northern Trail Outfitters (NTO) wants to start a loyalty program to reward repeat customers. The program will track every item a customer has bought and grants them points for discounts. The following conditions will exist upon implementation:
Data will be used to drive marketing and product development initiatives.
NTO estimates that the program will generate 100 million rows of data monthly.
NTO will use Salesforce's Einstein Analytics and Discovery to leverage their data and make business and marketing decisions.
What should the Data Architect do to store, collect, and use the reward program data?
A. Create a custom big object in Salesforce which will be used to capture the Reward Program data for consumption by Einstein.
B. Have Einstein connect to the point of sales system to capture the Reward Program data.
C. Create a big object in Einstein Analytics to capture the Loyalty Program data.
D. Create a custom object in Salesforce that will be used to capture the Reward Program data.
Correct Answer: A
Explanation: According to the official Salesforce documentation, big objects are designed to store and manage massive data volumes within Salesforce without affecting performance. They can be queried using Async SOQL or standard SOQL, and accessed through Apex, Visualforce, Lightning components, or APIs. Big objects are ideal for storing data used for analytics or reporting, such as the reward program data. Option A is correct because a custom big object can hold the reward program data in Salesforce and make it available for consumption by Einstein Analytics and Discovery. Option B is incorrect because Einstein cannot connect directly to the point-of-sale system to capture the reward program data. Option C is incorrect because Einstein Analytics does not support creating big objects. Option D is incorrect because custom objects are not suitable for storing 100 million new rows per month.
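A brief sketch of option A, assuming a hypothetical custom big object Reward_Event__b (big objects use the __b suffix) whose index covers the customer and date fields:

```apex
// Sketch: writing reward program rows into a custom big object (option A).
// Reward_Event__b and its fields are hypothetical.
Reward_Event__b event = new Reward_Event__b(
    Customer_Id__c = 'CUST-000123',
    Purchase_Date__c = System.now(),
    Points__c = 42);
// Big object DML runs outside the normal transaction, so it is issued
// immediately and cannot be rolled back.
Database.insertImmediate(event);
```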
Question 195:
Universal Containers (UC) is implementing a formal, cross-business-unit data governance program. As part of the program, UC will implement a team to make decisions on enterprise-wide data governance. Which two roles are appropriate as members of this team? Choose 2 answers
A. Analytics/BI Owners
B. Data Domain Stewards
C. Salesforce Administrators
D. Operational Data Users
Correct Answer: AB
Explanation: Analytics/BI Owners and Data Domain Stewards are appropriate roles as members of a team that makes decisions on enterprise-wide data governance. Analytics/BI Owners are responsible for defining the business requirements and metrics for data analysis and reporting, and Data Domain Stewards are responsible for defining and enforcing the data quality standards and rules for specific data domains. Salesforce Administrators and Operational Data Users are not suitable roles for this team, as they are more focused on the operational aspects of data management, such as configuration, maintenance, and usage.
Question 196:
Universal Containers has millions of rows of data in Salesforce that are being used in reports to evaluate historical trends. Performance has become an issue, as well as data storage limits. Which two strategies should be recommended when talking with stakeholders?
A. Use scheduled batch Apex to copy aggregate information into a custom object and delete the original records.
B. Combine Analytics Snapshots with a purging plan by reporting on the snapshot data and deleting the original records.
C. Use Data Loader to extract data, aggregate it, and write it back to a custom object, then delete the original records.
D. Configure the Salesforce Archiving feature to archive older records and remove them from the data storage limits.
Correct Answer: AD
Explanation: Using scheduled batch Apex to copy aggregate information into a custom object and delete the original records improves performance and frees data storage by removing detail rows while keeping the summary data needed for reporting. Configuring the Salesforce Archiving feature to archive older records and remove them from the data storage limits also helps with performance and storage, by moving historical data out of the operational dataset while keeping it accessible.
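A minimal sketch of option A, assuming a hypothetical Sales_History__c summary object; the query, retention window, and field names are illustrative:

```apex
// Sketch: scheduled batch Apex that rolls old detail rows up into a
// summary record, then deletes the originals (option A).
global class ArchiveSalesHistoryBatch
        implements Database.Batchable<SObject>, Database.Stateful {
    global Decimal runningTotal = 0;

    global Database.QueryLocator start(Database.BatchableContext bc) {
        // Closed deals older than two years: summarize, then purge.
        return Database.getQueryLocator(
            'SELECT Id, Amount FROM Opportunity ' +
            'WHERE IsClosed = true AND CloseDate < LAST_N_YEARS:2');
    }

    global void execute(Database.BatchableContext bc, List<Opportunity> scope) {
        for (Opportunity opp : scope) {
            runningTotal += (opp.Amount == null ? 0 : opp.Amount);
        }
        delete scope; // reclaim storage once the rows are counted
    }

    global void finish(Database.BatchableContext bc) {
        // Sales_History__c / Total_Amount__c are hypothetical.
        insert new Sales_History__c(Total_Amount__c = runningTotal);
    }
}
```

The job would be started with Database.executeBatch(new ArchiveSalesHistoryBatch()) and run weekly via a Schedulable wrapper.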
Question 197:
An architect has been asked by a client to develop a solution that will integrate data and resolve duplicates and discrepancies between Salesforce and one or more external systems. What two factors should the architect take into consideration when deciding whether or not to use a Master Data Management system to achieve this solution?
Choose 2 answers
A. Whether the systems are cloud-based or on-premises.
B. Whether or not Salesforce replaced a legacy CRM.
C. Whether the system of record changes for different tables.
D. The number of systems that are integrating with each other.
Correct Answer: CD
Explanation: Whether the system of record changes for different tables, and the number of systems integrating with each other, are the two factors the architect should weigh when deciding whether a Master Data Management (MDM) system is warranted. The system of record is the authoritative source of truth for a given entity or field in a given context; if it differs from table to table, an MDM system can manage and synchronize the data across systems and ensure quality and consistency. The number of integrated systems drives the complexity and scalability of the solution; with many systems, an MDM system provides a centralized, standardized way to integrate data and resolve duplicates and discrepancies. The other factors (cloud versus on-premises deployment, and whether Salesforce replaced a legacy CRM) do not affect the data quality or integration challenges that an MDM system addresses.
Question 198:
NTO has outgrown its current Salesforce org and will be migrating to a new org shortly. As part of this process, NTO will be migrating all of its metadata and data. NTO's data model in the source org has a complex relationship hierarchy with several master-detail and lookup relationships across objects, which should be maintained in the target org.
Which three things should a data architect do to maintain the relationship hierarchy during migration?
Choose 3 answers:
A. Use Data Loader to export the data from the source org and then import or upsert into the target org in sequential order.
B. Create an external ID field for each object in the target org and map source record IDs to this field.
C. Redefine the master-detail relationship fields as lookup relationship fields in the target org.
D. Replace source record IDs with new record IDs from the target org in the import file.
E. Keep the relationship fields populated with the source record IDs in the import file.
Correct Answer: ABD
Explanation: The correct answers are A, B, and D. To maintain the relationship hierarchy during migration, a data architect should use Data Loader to export the data from the source org and import or upsert it into the target org in sequential order (parents before children), create an external ID field on each object in the target org that holds the source record IDs, and replace source record IDs with the new target-org record IDs in the import file. These steps ensure the records are linked correctly and the relationships are preserved. Option C is incorrect because redefining master-detail relationship fields as lookup relationship fields changes the behavior and security of the data model. Option E is incorrect because leaving the relationship fields populated with source record IDs would cause errors and prevent the records from being imported.
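A hypothetical sketch of steps B and D working together for a parent-child pair, assuming a Legacy_Id__c external ID field that stores each record's source-org ID:

```apex
// Sketch: preserving a lookup during org-to-org migration.
// Legacy_Id__c is a hypothetical External ID field on Account.
Account parent = new Account(Name = 'Acme', Legacy_Id__c = '001xx000003DGb0');
Database.upsert(parent, Account.Legacy_Id__c, true);

// The child references its parent by the source-org ID; the platform
// resolves it to the new target-org ID through the external ID field,
// so the import never needs the new ID spelled out explicitly.
Contact child = new Contact(
    LastName = 'Doe',
    Account = new Account(Legacy_Id__c = '001xx000003DGb0'));
insert child;
```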
Question 199:
Universal Containers wants to develop a dashboard in Salesforce that will allow Sales Managers to do data exploration using their mobile device (i.e., drill down into sales-related data) and have the possibility of adding ad-hoc filters while on the move. What is a recommended solution for building data exploration dashboards in Salesforce?
A. Create a Dashboard in an external reporting tool, export data to the tool, and add a link to the dashboard in Salesforce.
B. Create a Dashboard in an external reporting tool, export data to the tool, and embed the dashboard in Salesforce using the Canvas toolkit.
C. Create a standard Salesforce Dashboard and connect it to reports with the appropriate filters.
D. Create a Dashboard using Analytics Cloud that will allow the user to create ad-hoc lenses and drill down.
Correct Answer: D
Explanation: Creating a Dashboard using Analytics Cloud that allows the user to create ad-hoc lenses and drill down is the recommended solution for building data exploration dashboards in Salesforce. Analytics Cloud is a powerful data analysis tool that lets users explore data through interactive dashboards, charts, graphs, and tables on any device. Users can create lenses (ad-hoc data queries that can be saved and reused) and drill into details using filters and facets. The external-tool options (A and B), whether linked or embedded via the Canvas toolkit, do not provide a native, seamless Salesforce experience and introduce additional data integration and security considerations. A standard Salesforce Dashboard connected to filtered reports (C) does not let the user build ad-hoc lenses or drill into data details from a mobile device.
Question 200:
Universal Containers (UC) is a business that works directly with individual consumers (B2C). They are moving from a current home-grown CRM system to Salesforce. UC has about one million consumer records. What should the architect recommend for optimal use of Salesforce functionality and also to avoid data loading issues?
A. Create a custom object Individual_Consumer__c to load all individual consumers.
B. Load all individual consumers as Account records and avoid using the Contact object.
C. Load one Account record and one Contact record for each individual consumer.
D. Create one Account and load individual consumers as Contacts linked to that one Account.
Correct Answer: D
Explanation: According to the exam guide, one of the objectives is to "describe best practices for implementing a single-org strategy in a B2C scenario". This implies that option D is the best practice for loading individual consumers as Contacts in Salesforce: it avoids creating unnecessary accounts and reduces data duplication. Option C is not correct because it creates one Account per Contact, which increases data volume and complexity. Options A and B are not correct because they do not leverage the standard Contact object, which provides native functionality and integration with other Salesforce features.
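A minimal sketch of option D's data shape, using a single hypothetical bucket account:

```apex
// Sketch: one bucket Account with individual consumers as Contacts (option D).
Account bucket = new Account(Name = 'Individual Consumers');
insert bucket;

List<Contact> consumers = new List<Contact>{
    new Contact(FirstName = 'Jane', LastName = 'Doe', AccountId = bucket.Id),
    new Contact(FirstName = 'John', LastName = 'Roe', AccountId = bucket.Id)
};
insert consumers; // the real one-million-record load would go through Data Loader or the Bulk API
```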