Universal Containers (UC) has one Salesforce org (Org A) and recently acquired a secondary company with its own Salesforce org (Org B). UC has decided to keep the orgs running separately but would like to bidirectionally share opportunities between the orgs in near-real time.
Which 3 options should a data architect recommend to share data between Org A and Org B?
Choose 3 answers.
A. Leverage Heroku Connect and Heroku Postgres to bidirectionally sync Opportunities.
B. Install a 3rd party AppExchange tool to handle the data sharing
C. Develop an Apex class that pushes opportunity data between orgs daily via the Apex Scheduler.
D. Leverage middleware tools to bidirectionally send Opportunity data across orgs.
E. Use Salesforce Connect and the cross-org adapter to visualize Opportunities into external objects
Correct Answer: ADE
Explanation: Heroku Connect with Heroku Postgres, middleware tools, and Salesforce Connect with the cross-org adapter are all viable options for sharing data between Org A and Org B in near-real time. Installing a 3rd party AppExchange tool may not provide bidirectional sync and may carry additional costs, and an Apex class that pushes opportunity data between orgs on a daily schedule does not meet the near-real-time requirement.
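For context on option E: once the cross-org adapter is configured, Org B's opportunities appear in Org A as an external object (typically Opportunity__x, with the remote standard fields exposed as custom fields whose exact API names are generated during setup). The anonymous Apex sketch below assumes those generated names and simply reads the external records; it is an illustration, not part of the exam answer.
```apex
// Minimal sketch: read Org B opportunities surfaced in Org A through the
// Salesforce Connect cross-org adapter. Assumes the external object is named
// Opportunity__x and that Name and Amount were mapped to Name__c and
// Amount__c (check the generated field names in Setup before using).
List<Opportunity__x> remoteOpps = [
    SELECT ExternalId, Name__c, Amount__c
    FROM Opportunity__x
    LIMIT 10
];
for (Opportunity__x opp : remoteOpps) {
    System.debug('Org B opportunity ' + opp.Name__c + ': ' + opp.Amount__c);
}
```
Keep in mind that external objects only visualize the remote data in place; actually moving records bidirectionally between the orgs still relies on middleware or Heroku Connect.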
Question 132:
Northern Trail Outfitters (NTO) has the following systems:
Customer master: source of truth for customer information
Service Cloud: customer support
Marketing Cloud: marketing support
Enterprise data warehouse: business reporting
The customer data is duplicated across all these systems and is not kept in sync. Customers are also complaining that they receive repeated marketing emails and have to call in to update their information.
NTO is planning to implement master data management (MDM) solution across the enterprise.
Which three data issues will an MDM tool solve?
Choose 3 answers
A. Data completeness
B. Data loss and recovery
C. Data duplication
D. Data accuracy and quality
E. Data standardization
Correct Answer: CDE
Explanation: According to the What is Master Data Management (MDM)? article, some of the data challenges that an MDM tool can solve are data duplication, data accuracy and quality, and data standardization. The article states that "MDM solutions comprise a broad range of data cleansing, transformation, and integration practices. As data sources are added to the system, MDM initiates processes to identify, collect, transform, and repair data. Once the data meets the quality thresholds, schemas and taxonomies are created to help maintain a high-quality master reference." Therefore, an MDM tool can help NTO eliminate data duplication across different systems, improve data accuracy and quality by removing errors and inconsistencies, and standardize data formats and definitions for better integration and analysis.
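As a concrete illustration of the duplicate-identification step that an MDM tool automates, here is a hedged anonymous Apex sketch that groups Contact records by a normalized email key. The matching rule is deliberately simplistic and invented for this example; real MDM matching uses configurable fuzzy rules across many attributes.
```apex
// Illustrative only: group contacts by a normalized email key to surface
// potential duplicates, similar in spirit to an MDM identify/match step.
Map<String, List<Contact>> byEmail = new Map<String, List<Contact>>();
for (Contact c : [SELECT Id, LastName, Email FROM Contact WHERE Email != null LIMIT 10000]) {
    String key = c.Email.trim().toLowerCase();
    if (!byEmail.containsKey(key)) {
        byEmail.put(key, new List<Contact>());
    }
    byEmail.get(key).add(c);
}
for (String key : byEmail.keySet()) {
    if (byEmail.get(key).size() > 1) {
        System.debug('Potential duplicates for ' + key + ': ' + byEmail.get(key).size() + ' contacts');
    }
}
```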
Question 133:
UC is rolling out a Sales App globally to bring sales teams together on one platform. UC expects millions of opportunities and accounts to be created and is concerned about the performance of the application.
Which 3 recommendations should the data architect make to avoid data skew? Choose 3 answers.
A. Use picklist fields rather than lookup to custom object.
B. Limit assigning ownership of 10000 records to one user.
C. Assign 10000 opportunities to one account.
D. Limit associating 10000 opportunities to one account.
E. Limit associating 10000 records looking up to the same record.
Correct Answer: BDE
Explanation: Data skew occurs when a large number of child records are associated with a single parent record, or when a single user owns a large number of records. This can cause performance issues and lock contention. To avoid data skew, the data architect should limit assigning ownership of 10,000 records to one user, limit associating 10,000 opportunities to one account, and limit associating 10,000 records looking up to the same record.
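To spot existing skew before it causes locking problems, a data architect might run aggregate queries like the hedged anonymous Apex sketch below; the 9,000 threshold is an arbitrary early-warning value chosen for illustration, slightly below the commonly cited 10,000-record guideline.
```apex
// Sketch: flag opportunity ownership skew and account data skew with
// aggregate queries. The 9,000 threshold is an illustrative early warning.
List<AggregateResult> ownerSkew = [
    SELECT OwnerId ownerId, COUNT(Id) cnt
    FROM Opportunity
    GROUP BY OwnerId
    HAVING COUNT(Id) > 9000];
for (AggregateResult ar : ownerSkew) {
    System.debug('Owner ' + ar.get('ownerId') + ' owns ' + ar.get('cnt') + ' opportunities');
}

List<AggregateResult> accountSkew = [
    SELECT AccountId acctId, COUNT(Id) cnt
    FROM Opportunity
    WHERE AccountId != null
    GROUP BY AccountId
    HAVING COUNT(Id) > 9000];
for (AggregateResult ar : accountSkew) {
    System.debug('Account ' + ar.get('acctId') + ' has ' + ar.get('cnt') + ' opportunities');
}
```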
Question 134:
A company wants to document the data architecture of a Salesforce organization.
What are two valid metadata types that should be included? (Choose two.)
A. RecordType
B. Document
C. CustomField
D. SecuritySettings
Correct Answer: AC
Explanation: Option A is correct because RecordType belongs in data architecture documentation; record types define different business processes, picklist values, and page layouts for different users. Option C is correct because CustomField defines custom attributes on standard or custom objects and is central to the data model. Option B is not correct because Document describes file content stored in folders rather than the data model. Option D is not correct because SecuritySettings covers security configuration such as password policies, network access, and session settings, which is not part of the data architecture.
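As a hedged illustration of how this information can be gathered when documenting an object's data architecture, the anonymous Apex sketch below uses standard Schema describe methods to list an object's record types and custom fields; the choice of Account is just an example.
```apex
// Sketch: enumerate record types and custom fields for one object (Account)
// using Schema describe calls, e.g. to feed a data architecture document.
Schema.DescribeSObjectResult objDescribe = Account.SObjectType.getDescribe();

for (Schema.RecordTypeInfo rti : objDescribe.getRecordTypeInfos()) {
    System.debug('RecordType: ' + rti.getName() + ' (active=' + rti.isActive() + ')');
}
for (Schema.SObjectField field : objDescribe.fields.getMap().values()) {
    Schema.DescribeFieldResult fd = field.getDescribe();
    if (fd.isCustom()) {
        System.debug('CustomField: ' + fd.getName() + ' (' + String.valueOf(fd.getType()) + ')');
    }
}
```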
Question 135:
Northern Trail Outfitters is planning to build a consent form to record customer authorization for marketing purposes.
What should a data architect recommend to fulfill this requirement?
A. Use custom fields to capture the authorization details.
B. Create a custom object to maintain the authorization.
C. Utilize the Authorization Form Consent object to capture the consent.
D. Use AppExchange solution to address the requirement.
Correct Answer: C
Explanation: The Authorization Form Consent object is a standard object that allows you to capture customer consent for marketing purposes. It includes fields such as Consent Captured Date Time, Consent Captured Source, and Status, along with lookups to the consent giver and the related authorization form text. You can use this object to create consent forms and track customer responses. This is the best option to fulfill the requirement because it does not require any custom development or an external solution.
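A minimal sketch of capturing a consent with this object is shown below. It assumes an AuthorizationFormText record and a Contact already exist, and the field names and picklist values (for example Status = 'Seen') should be verified against the Consent Management object reference for your API version.
```apex
// Sketch: capture a marketing consent on the standard Authorization Form
// Consent object. Field names and the Status value are assumptions to verify
// against the Consent Management object reference.
AuthorizationFormText formText = [SELECT Id FROM AuthorizationFormText LIMIT 1];
Contact consentGiver = [SELECT Id FROM Contact LIMIT 1];

AuthorizationFormConsent consent = new AuthorizationFormConsent(
    Name = 'Marketing email consent',
    AuthorizationFormTextId = formText.Id,
    ConsentGiverId = consentGiver.Id,
    ConsentCapturedDateTime = System.now(),
    ConsentCapturedSource = 'Web consent form',
    Status = 'Seen'
);
insert consent;
```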
Question 136:
UC has a large number of orders coming in from its online portal. Historically, all orders are assigned to a generic user.
Which 2 measures should a data architect recommend to avoid performance issues while working with a large number of order records? Choose 2 answers:
A. Clear the role field in the generic user record.
B. Salesforce handles the assignment of orders automatically and there is no performance impact.
C. Create a role at top of role hierarchy and assign the role to the generic user.
D. Create a pool of generic users and distribute the assignment of orders to the pool of users.
Correct Answer: AC
Explanation: Clearing the role field on the generic user record and creating a role at the top of the role hierarchy and assigning it to the generic user are two measures that can help avoid performance issues while working with a large number of order records. These measures prevent the data skew and lock contention that may occur when a single user owns or shares a large number of records.
Question 137:
Which three characteristics of a skinny table help improve report and query performance?
A. Skinny tables can contain frequently used fields and thereby help avoid joins.
B. Skinny tables can be used to create custom indexes on multi-select picklist fields.
C. Skinny tables provide a view across multiple objects for easy access to combined data.
D. Skinny tables are kept in sync with changes to data in the source tables.
E. Skinny tables do not include records that are available in the recycle bin.
Correct Answer: ADE
Explanation: The three characteristics of a skinny table that help improve report and query performance are: skinny tables can contain frequently used fields and thereby help avoid joins; skinny tables are kept in sync with changes to data in the source tables; and skinny tables do not include records that are available in the Recycle Bin. These characteristics reduce query complexity and execution time and improve data accuracy and freshness. For example, a skinny table combines an object's frequently used standard and custom fields in a single table, avoiding the resource-intensive join between the underlying standard-field and custom-field tables that can slow down queries. Skinny tables are updated automatically when the source tables are modified, so they always reflect the latest data, and because they exclude Recycle Bin records, they contain only active records that are relevant for reports and queries.
Question 138:
Cloud Kicks has the following requirements:
Their Shipment custom object must always relate to a Product, a Sender, and a Receiver (all separate custom objects).
If a Shipment is currently associated with a Product, Sender, or Receiver, deletion of those records should not be allowed.
Each custom object must have separate sharing models.
What should an Architect do to fulfill these requirements?
A. Associate the Shipment to each parent record by using a VLOOKUP formula field.
B. Create a required Lookup relationship to each of the three parent records.
C. Create a Master-Detail relationship to each of the three parent records.
D. Create two Master-Detail and one Lookup relationship to the parent records.
Correct Answer: B
Explanation: A required Lookup relationship ensures that each Shipment record has a value for all three parent records, and the lookup's delete option can be set so that a parent still referenced by a Shipment cannot be deleted. A Master-Detail relationship would not allow a separate sharing model for the Shipment object, and a VLOOKUP formula field would neither enforce the relationship nor prevent deletion.
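To make the expected behavior concrete, the hedged test sketch below assumes custom objects named Shipment__c, Product__c, Sender__c, and Receiver__c (with text Name fields), with required lookups on Shipment__c configured to disallow deletion of referenced parents; it only verifies that deleting a referenced Product fails.
```apex
// Sketch: verify that a parent referenced by a Shipment cannot be deleted.
// Object and field API names (Shipment__c, Product__c, etc.) are assumptions.
@IsTest
private class ShipmentDeleteRestrictionTest {
    @IsTest
    static void parentDeleteIsBlockedWhileReferenced() {
        Product__c product = new Product__c(Name = 'Widget');
        Sender__c sender = new Sender__c(Name = 'Warehouse A');
        Receiver__c receiver = new Receiver__c(Name = 'Store 12');
        insert new List<SObject>{ product, sender, receiver };

        insert new Shipment__c(
            Name = 'SHP-0001',
            Product__c = product.Id,
            Sender__c = sender.Id,
            Receiver__c = receiver.Id
        );

        // With "Don't allow deletion of the lookup record" selected on the
        // lookup field, this delete should fail while the Shipment exists.
        Database.DeleteResult dr = Database.delete(product.Id, false);
        System.assertEquals(false, dr.isSuccess(),
            'Referenced Product should not be deletable');
    }
}
```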
Question 139:
Northern Trail Outfitters (NTO) has an external product master system that syncs product and pricing information with Salesforce. Users have been complaining that they are seeing discrepancies in product and pricing information displayed on the NTO website and Salesforce.
As a data architect, which action is recommended to avoid data sync issues?
A. Build a custom integration for two-way sync of product and pricing information between product master to Salesforce.
B. Build a custom integration for one-way sync of product and pricing information from product master to Salesforce.
C. Implement a manual process to update the products from an extract from the products master on a weekly basis.
D. Use the Customer 360 data manager to sync product and pricing information from product master database to Salesforce.
Correct Answer: D
Explanation: According to Trailhead, Customer 360 Data Manager is a feature that allows administrators to connect, reconcile, and share customer data across Salesforce orgs and external systems. Customer 360 Data Manager can sync product and pricing information from the product master database to Salesforce using predefined mappings and rules, ensuring data quality and consistency, which is why option D is the recommended answer. Option A is incorrect because a custom two-way sync between the product master and Salesforce can introduce data conflicts and duplication issues, as both systems can update the same data independently. Option B is incorrect because a custom one-way sync from the product master to Salesforce can be costly and time-consuming, as it requires custom development and maintenance. Option C is incorrect because a manual weekly update from a product master extract results in data latency and errors, as the product information may change more frequently than once a week.
Question 140:
Universal Containers (UC) uses the following Salesforce products:
Sales Cloud for customer management.
Marketing Cloud for marketing.
Einstein Analytics for business reporting.
UC occasionally gets a list of prospects from a third-party source as comma-separated values (CSV) files for marketing purposes. Historically, UC would load the contacts into the Lead object in Salesforce and sync to Marketing Cloud to send marketing communications. The number of records in the Lead object has grown over time and has been consuming large amounts of storage in Sales Cloud. UC is looking for recommendations to reduce the storage and for advice on how to optimize the marketing process.
What should a data architect recommend to UC in order to immediately avoid storage issues in the future?
A. Load the CSV files in Einstein Analytics and sync with Marketing Cloud prior to sending marketing communications.
B. Load the CSV files in an external database and sync with Marketing Cloud prior to sending marketing communications.
C. Load the contacts directly to Marketing Cloud and have a reconciliation process to track prospects that are converted to customers.
D. Continue to use the existing process to use the Lead object to sync with Marketing Cloud, and delete Lead records from Sales Cloud after the sync is complete.
Correct Answer: C
Explanation: According to the Salesforce documentation, Marketing Cloud is a platform for creating and managing marketing campaigns across multiple channels, such as email, mobile, social media, and web. Marketing Cloud can integrate with Sales Cloud and other Salesforce products to share data and insights; one way to integrate the two platforms is Marketing Cloud Connect, which syncs data using synchronized data sources. However, if UC only occasionally gets lists of prospects from third-party sources as CSV files for marketing purposes, it is neither necessary nor efficient to load them into Sales Cloud first and then sync them to Marketing Cloud. Doing so consumes large amounts of Sales Cloud storage, which is limited by license type, and can introduce data quality issues such as duplicates or outdated information.

A better option for UC is to load the contacts directly into Marketing Cloud using an Import Definition, which imports data from external files or databases into data extensions (custom tables that store marketing data in Marketing Cloud). This avoids the storage issue in Sales Cloud and optimizes the marketing process by sending marketing communications directly from Marketing Cloud. To track prospects that convert to customers, UC can run a reconciliation process that compares the contacts in Marketing Cloud with the accounts or contacts in Sales Cloud using SQL queries or API calls, or sync the converted contacts via Marketing Cloud Connect synchronized data sources.

Loading the CSV files in Einstein Analytics and syncing with Marketing Cloud (option A) adds unnecessary complexity and latency; Einstein Analytics is designed for interactive dashboards and reports, not for importing and exporting marketing data. Loading the CSV files in an external database and syncing with Marketing Cloud (option B) incurs additional cost and maintenance for the external database and introduces data security and privacy risks, as the data may not be protected by Salesforce. Continuing to use the Lead object and deleting Lead records from Sales Cloud after the sync (option D) can cause performance issues and data loss: deleting leads affects reporting and auditing, can fire unintended workflows and validations, and can create inconsistencies and synchronization errors between Sales Cloud and Marketing Cloud.