A large automobile company has implemented Salesforce for its sales associates. Leads flow from its website into Salesforce through a batch integration, and the batch job links the leads to Accounts. Customers visiting its retail stores are also created in Salesforce as Accounts.
The company has noticed a large number of duplicate Accounts in Salesforce. On analysis, it was found that some customers interact with the website and also visit a store. Sales associates use Global Search to look for customers in Salesforce before creating them.
Which scalable option should a data architect implement to avoid duplicates?
A. Create duplicate rules in Salesforce to detect duplicates during the Account creation process.
B. Implement an MDM solution to validate customer information before creating Accounts in Salesforce.
C. Build a custom search based on Account fields that can be matched against customers when they visit the store.
D. Customize the Account creation process to check whether the customer already exists before creating an Account.
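For context on option A: Salesforce duplicate rules are configured declaratively, not coded, but the matching they perform can be sketched in Python. The snippet below is an illustration only (the field names and normalization steps are assumptions, not Salesforce's actual matching algorithm); it shows why normalized match keys catch duplicates that an associate's manual Global Search might miss due to formatting differences.

```python
def normalize(record):
    """Build a match key from fields a matching rule might compare,
    ignoring case and phone formatting (hypothetical normalization)."""
    return (
        record.get("email", "").strip().lower(),
        record.get("phone", "").replace("-", "").replace(" ", ""),
    )

def find_duplicates(existing, incoming):
    """Return incoming records whose match key collides with an existing record."""
    seen = {normalize(r) for r in existing}
    return [r for r in incoming if normalize(r) in seen]

existing = [{"email": "Ana@Example.com", "phone": "555-0100"}]
incoming = [
    {"email": "ana@example.com", "phone": "5550100"},   # same person, different formatting
    {"email": "bob@example.com", "phone": "555-0101"},  # genuinely new customer
]
dupes = find_duplicates(existing, incoming)
```

Because duplicate rules apply this kind of matching automatically on every create path (API, batch load, and manual entry), they scale better than relying on users to search first.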
How can an architect find information about who is creating, changing, or deleting certain fields within the past two months?
A. Remove "Customize Application" permissions from everyone else.
B. Export the metadata and search it for the fields in question.
C. Create a field history report for the fields in question.
D. Export the setup audit trail and find the fields in question.
Universal Containers (UC) is concerned that data is being corrupted daily, either through negligence or maliciousness. They want to implement a backup strategy to help recover any data that is corrupted, mistakenly changed, or deleted. What should the data architect consider when designing a field-level audit and recovery plan?
A. Reduce data storage by purging old data.
B. Implement an AppExchange package.
C. Review projected data storage needs.
D. Schedule a weekly export file.
Universal Containers (UC) has a data model as shown in the image. The Project object has a private sharing model, and it has Roll-Up Summary fields to calculate the number of resources assigned to the project, total hours for the project, and the number of work items associated with the project.
What should the architect consider, knowing there will be a large amount of time entry records to be loaded regularly from an external system into Salesforce.com?
A. Load all data using external IDs to link to parent records.
B. Use workflow to calculate summary values instead of Roll-Ups.
C. Use triggers to calculate summary values instead of Roll-Ups.
D. Load all data after deferring sharing calculations.
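One practical concern behind this question is parent-record lock contention: when many child time-entry records that roll up to the same Project are loaded in parallel batches, the parent row is locked repeatedly. A common mitigation is to sort and batch child rows by their parent key before loading. The sketch below illustrates that batching strategy in Python; the field name `Project_External_Id__c` and the helper itself are hypothetical, not part of any Salesforce API.

```python
from collections import defaultdict
from itertools import islice

def batches_by_parent(time_entries, batch_size=200):
    """Group time-entry rows by parent external ID, then emit batches in
    parent order, so children of the same parent cluster together and
    parent-record lock contention during a bulk load is reduced."""
    by_parent = defaultdict(list)
    for row in time_entries:
        by_parent[row["Project_External_Id__c"]].append(row)
    # Flatten in sorted-parent order, then chunk into batches.
    ordered = [row for parent in sorted(by_parent) for row in by_parent[parent]]
    it = iter(ordered)
    while True:
        chunk = list(islice(it, batch_size))
        if not chunk:
            break
        yield chunk

rows = [
    {"Project_External_Id__c": "P2", "Hours__c": 3},
    {"Project_External_Id__c": "P1", "Hours__c": 5},
    {"Project_External_Id__c": "P2", "Hours__c": 2},
]
batches = list(batches_by_parent(rows))
```

This is one reason deferring sharing calculations (option D) and keying loads on external IDs are often discussed together for large regular loads into a private sharing model.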
A large retail company has recently chosen Salesforce as its CRM solution. They have the following record counts:
2,500,000 Accounts and 25,000,000 Contacts.
When doing an initial performance test, the data architect noticed an extremely slow response for reports and list views.
What should a data architect do to solve the performance issue?
A. Load only the data that a user is permitted to access.
B. Add custom indexes on frequently searched Account and Contact fields.
C. Limit data loading to the 2000 most recently created records.
D. Create a skinny table to represent account and contact objects.
A shipping and logistics company has created a large number of reports within Sales Cloud since Salesforce was introduced. Some of these reports analyze large amounts of data regarding the whereabouts of the company's containers, and they are starting to time out when users are trying to run the reports. What is a recommended approach to avoid these time-out issues?
A. Improve reporting performance by creating a custom Visualforce report that is using a cache of the records in the report.
B. Improve reporting performance by replacing the existing reports in Sales Cloud with new reports based on Analytics Cloud.
C. Improve reporting performance by creating an Apex trigger for the Report object that will pre-fetch data before the report is run.
D. Improve reporting performance by creating a dashboard that is scheduled to run the reports only once per day.
UC has released a new disaster recovery (DR) policy that states that cloud solutions need a business continuity plan in place, separate from the cloud provider's built-in data recovery solution.
Which solution should a data architect use to comply with the DR policy?
A. Leverage a third-party tool that extracts Salesforce data/metadata and stores the information in an external protected system.
B. Leverage Salesforce weekly exports, and store the data in flat files on a protected system.
C. Utilize an ETL tool to migrate data to an on-premise archive solution.
D. Write a custom batch job to extract data changes nightly, and store in an external protected system.
NTO uses Salesforce to manage relationships and track sales opportunities. It has 10 million customers and 100 million opportunities. The CEO has been complaining that his dashboard takes 10 minutes to run and sometimes fails to load, throwing a time-out error.
Which three options should help improve the dashboard performance?
Choose 3 answers:
A. Use selective queries to reduce the amount of data being returned.
B. De-normalize the data by reducing the number of joins.
C. Remove widgets from the dashboard to reduce the number of graphics loaded.
D. Run the dashboard for CEO and send it via email.
E. Reduce the amount of data queried by archiving unused opportunity records.
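Option A hinges on query selectivity: a selective query filters on an indexed field so the engine jumps straight to the matching rows instead of scanning the whole table. The Python sketch below is a hypothetical analogy (a dict standing in for a database index over sample opportunity rows), not Salesforce internals; it shows the two access patterns returning the same result while touching very different amounts of data.

```python
from collections import defaultdict

# 1,000 sample opportunities; every 10th one is "Closed Won".
opportunities = [
    {"Id": i, "StageName": "Closed Won" if i % 10 == 0 else "Prospecting"}
    for i in range(1000)
]

# Build an "index" on StageName once (analogous to a custom index).
stage_index = defaultdict(list)
for opp in opportunities:
    stage_index[opp["StageName"]].append(opp)

# Selective access: touches only the 100 matching rows.
selective = stage_index["Closed Won"]

# Non-selective equivalent: scans all 1,000 rows to find the same 100.
full_scan = [o for o in opportunities if o["StageName"] == "Closed Won"]
```

At 100 million opportunities, the difference between these two access patterns is what separates a dashboard that loads from one that times out.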
As part of a phased Salesforce rollout, there will be three deployments spread out over the year. The requirements have been carefully documented. Which two methods should an architect use to trace configuration changes back to the detailed requirements? Choose 2 answers
A. Review the setup audit trail for configuration changes.
B. Put the business purpose in the Description of each field.
C. Maintain a data dictionary with the justification for each field.
D. Use the Force.com IDE to save the metadata files in source control.
Universal Containers (UC) is implementing a new customer categorization process where customers should be assigned to a Gold, Silver, or Bronze category if they've purchased UC's new support service. Customers are expected to be evenly distributed across all three categories. Currently, UC has around 500,000 customers, and is expecting 1% of existing non-categorized customers to purchase UC's new support service every month over the next five years. What is the recommended solution to ensure long-term performance, bearing in mind the above requirements?
A. Implement a new global picklist custom field with Gold, Silver, and Bronze values and enable it in Account.
B. Implement a new picklist custom field in the Account object with Gold, Silver, and Bronze values.
C. Implement a new Categories custom object and a master-detail relationship from Account to Category.
D. Implement a new Categories custom object and create a lookup field from Account to Category.