Universal Containers (UC) wants to capture information on how data entities are stored within the different applications and systems used within the company. For that purpose, the architecture team decided to create a data dictionary covering the main business domains within UC. Which two common techniques are used when building a data dictionary to store information on how business entities are defined? Choose 2 answers
A. Use Salesforce Object Query Language.
B. Use a data definition language.
C. Use an entity relationship diagram.
D. Use the Salesforce Metadata API.
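(In practice, a data dictionary is usually seeded from each system's own schema metadata. The sketch below is a minimal, hypothetical example that pulls field definitions from Salesforce into a flat dictionary file; it assumes the third-party simple_salesforce library, placeholder credentials, and the standard Account object.)

```python
# Hypothetical sketch: seeding a data dictionary from Salesforce schema
# metadata. Assumes the third-party simple_salesforce library; credentials
# and the output file name are placeholders.
import csv

from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="***", security_token="***")

# Describe the standard Account object to capture how each field is defined.
describe = sf.Account.describe()

with open("data_dictionary_account.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["Object", "Field API Name", "Label", "Type", "Length", "Help Text"])
    for field in describe["fields"]:
        writer.writerow([
            "Account",
            field["name"],
            field["label"],
            field["type"],
            field.get("length", ""),
            field.get("inlineHelpText") or "",
        ])
```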
NTO processes orders from its website via an order management system (OMS). The OMS stores over 2 million historical records and is currently not integrated with Salesforce. The Sales team at NTO uses Sales Cloud and would like visibility into related customer orders, yet they do not want to persist millions of records directly in Salesforce. NTO has asked the data architect to evaluate Salesforce Connect and the concept of data virtualization. Which three considerations are needed prior to a Salesforce Connect implementation? Choose 3 answers:
A. Create a second System Administrator user for authentication to the external source.
B. Develop an object relationship strategy.
C. Identify the external tables to sync into external objects.
D. Assess whether the external data source is reachable via an OData endpoint.
E. Configure a middleware tool to poll external table data.
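(A quick way to check consideration D: OData services expose a $metadata document at their service root. The sketch below is a minimal reachability probe using Python's requests library; the OMS service URL is a placeholder.)

```python
# Hypothetical sketch: verifying that the OMS exposes an OData endpoint before
# a Salesforce Connect implementation. The service root URL is a placeholder.
import requests

ODATA_SERVICE_ROOT = "https://oms.example.com/odata/v4"  # hypothetical endpoint

# A successful $metadata response returns an EDMX schema listing the entity
# sets (external tables) that could be synced into external objects.
resp = requests.get(f"{ODATA_SERVICE_ROOT}/$metadata", timeout=10)
resp.raise_for_status()
print("Content-Type:", resp.headers.get("Content-Type"))
print(resp.text[:500])
```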
Universal Containers wants to develop a dashboard in Salesforce that will allow Sales Managers to do data exploration using their mobile device (i.e., drill down into sales-related data) and have the possibility of adding ad-hoc filters while on the move. What is a recommended solution for building data exploration dashboards in Salesforce?
A. Create a Dashboard in an external reporting tool, export data to the tool, and add a link to the dashboard in Salesforce.
B. Create a Dashboard in an external reporting tool, export data to the tool, and embed the dashboard in Salesforce using the Canvas toolkit.
C. Create a standard Salesforce Dashboard and connect it to reports with the appropriate filters.
D. Create a Dashboard using Analytics Cloud that will allow the user to create ad-hoc lenses and drill down.
As part of a phased Salesforce rollout, there will be three deployments spread out over the year. The requirements have been carefully documented. Which two methods should an architect use to trace configuration changes back to the detailed requirements? Choose 2 answers
A. Review the setup audit trail for configuration changes.
B. Put the business purpose in the Description of each field.
C. Maintain a data dictionary with the justification for each field.
D. Use the Force.com IDE to save the metadata files in source control.
A large retail company has recently chosen Salesforce as its CRM solution. They have the following record counts: 2,500,000 accounts and 25,000,000 contacts. When doing an initial performance test, the data architect noticed extremely slow response times for reports and list views. What should a data architect do to solve the performance issue?
A. Load only the data that the users are permitted to access.
B. Add custom indexes on frequently searched Account and Contact object fields.
C. Limit data loading to the 2000 most recently created records.
D. Create skinny tables to represent the Account and Contact objects.
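(The underlying issue is selectivity at large data volumes: reports, list views, and SOQL perform well only when their filters can use an index. The sketch below, assuming simple_salesforce and placeholder field names, contrasts a selective filter on an indexed field with a non-selective one.)

```python
# Hypothetical sketch: contrasting selective and non-selective filters at
# large data volumes. Assumes simple_salesforce; field names and credentials
# are placeholders.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="***", security_token="***")

# Selective: filters on an indexed field (external ID fields are indexed
# automatically; other fields can be custom-indexed via Salesforce Support).
selective = sf.query(
    "SELECT Id, Name FROM Contact WHERE Loyalty_Number__c = 'LN-000123'"
)
print("Rows:", selective["totalSize"])

# Non-selective (commented out): a leading-wildcard filter on an unindexed
# field forces a scan of 25M contacts, the same behavior that makes
# broad reports and list views crawl.
# sf.query("SELECT Id FROM Contact WHERE Description LIKE '%gold%'")
```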
As part of addressing General Data Protection Regulation (GDPR) requirements, UC plans to implement a data classification policy for all its internal systems that store customer information, including Salesforce. What should a data architect recommend so that UC can easily classify consumer information maintained in Salesforce across both standard and custom objects?
A. Use AppExchange products to classify fields based on the policy.
B. Use the data classification metadata fields available in the field definition.
C. Create a custom picklist field to capture the classification of customer information.
D. Build reports for customer information and validate.
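(Answer B refers to the Compliance Categorization and Data Sensitivity Level metadata that can be set on any standard or custom field. The sketch below assumes simple_salesforce and assumes these values are exposed on FieldDefinition as ComplianceGroup and SecurityClassification; treat those field names as assumptions.)

```python
# Hypothetical sketch: reading the built-in data classification metadata for
# every field on an object. Assumes simple_salesforce; the ComplianceGroup and
# SecurityClassification field names are assumptions.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="***", security_token="***")

soql = (
    "SELECT QualifiedApiName, ComplianceGroup, SecurityClassification "
    "FROM FieldDefinition "
    "WHERE EntityDefinition.QualifiedApiName = 'Contact'"
)
for field in sf.query(soql)["records"]:
    print(field["QualifiedApiName"], field["ComplianceGroup"], field["SecurityClassification"])
```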
Universal Containers (UC) is a business that works directly with individual consumers (B2C). They are moving from a current home-grown CRM system to Salesforce. UC has about one million consumer records. What should the architect recommend for optimal use of Salesforce functionality and also to avoid data loading issues?
A. Create a custom object, Individual_Consumer__c, to load all individual consumers.
B. Load all individual consumers as Account records and avoid using the Contact object.
C. Load one Account record and one Contact record for each individual consumer.
D. Create one Account and load individual consumers as Contacts linked to that one Account.
UC is planning a massive Salesforce implementation with large volumes of data. As part of the org's implementation, several roles, territories, groups, and sharing rules have been configured. The data architect has been tasked with loading all of the required data, including user data, in a timely manner. What should a data architect do to minimize data load times due to system calculations?
A. Enable defer sharing calculations, and suspend sharing rule calculations
B. Load the data through Data Loader and turn on parallel processing.
C. Leverage the Bulk API and concurrent processing with multiple batches
D. Enable granular locking to avoid UNABLE_TO_LOCK_ROW errors.
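(Option A is a Setup-level switch that defers sharing rule calculations during the load; option C describes what the load itself typically looks like. The sketch below assumes simple_salesforce's Bulk API wrapper with placeholder data; the batch_size/use_serial arguments are assumptions about that wrapper.)

```python
# Hypothetical sketch: a large parallel load through the Bulk API (option C).
# Deferring sharing rule calculations (option A) is a Setup-level switch, not
# code. Assumes simple_salesforce; all data values are placeholders.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="***", security_token="***")

records = [
    {"LastName": f"Contact {i}", "AccountId": "001000000000001AAA"}  # placeholder parent Id
    for i in range(50_000)
]

# use_serial=False keeps batches processing in parallel; switch to serial mode
# (or defer/suspend sharing calculations) if parallel batches hit row-lock contention.
results = sf.bulk.Contact.insert(records, batch_size=10_000, use_serial=False)
print(sum(1 for r in results if r["success"]), "records loaded")
```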
Universal Containers (UC) has a data model as shown in the image. The Project object has a private sharing model, and it has Roll-Up Summary fields to calculate the number of resources assigned to the project, total hours for the project, and the number of work items associated with the project. What should the architect consider, knowing there will be a large number of Time Entry records to be loaded regularly from an external system into Salesforce?
A. Load all data using external IDs to link to parent records.
B. Use workflow to calculate summary values instead of Roll-Up Summary fields.
C. Use triggers to calculate summary values instead of Roll-Up Summary fields.
D. Load all data after deferring sharing calculations.
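(Option A, loading via external IDs, avoids having to look up Salesforce record IDs for each parent Project before the load. A minimal sketch, assuming simple_salesforce and hypothetical Time_Entry__c, Project_Code__c, and External_Id__c names:)

```python
# Hypothetical sketch of option A: load child Time Entry rows that reference
# their parent Project by an external ID instead of an 18-character record Id.
# Assumes simple_salesforce; object and field names are made up for illustration.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="***", security_token="***")

time_entry = {
    "Hours__c": 6,
    # Relationship by external ID; in a CSV-based Data Loader job this is the
    # Project__r.Project_Code__c column.
    "Project__r": {"Project_Code__c": "P-0001"},
}

# Upserting on the Time Entry's own external ID keeps the recurring load idempotent.
sf.Time_Entry__c.upsert("External_Id__c/TE-2024-000123", time_entry)
```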
Universal Containers (UC) has implemented Salesforce. UC is running out of storage and needs an archiving solution: UC would like to maintain two years of data in Salesforce and archive older data out of Salesforce. Which solution should a data architect recommend as an archiving solution?
A. Use a third-party backup solution to back up all data off platform.
B. Build a batch job to move all records off platform, and delete all records from Salesforce.
C. Build a batch job to move records older than two years off platform, and delete those records from Salesforce.
D. Build a batch job to move all records off platform, and delete old records from Salesforce.
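(The pattern behind answer C is a recurring batch job that extracts records older than two years, persists them off platform, and then deletes them from Salesforce. A minimal sketch, assuming simple_salesforce and a hypothetical Order__c object:)

```python
# Hypothetical sketch of the archive-and-purge pattern behind answer C:
# export records older than two years off platform, then delete them from
# Salesforce. Assumes simple_salesforce; Order__c and the date filter are
# placeholders.
import json

from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="***", security_token="***")

# Records created more than roughly two years ago (before the last 730 days).
old = sf.query_all(
    "SELECT Id, Name, CreatedDate FROM Order__c WHERE CreatedDate < LAST_N_DAYS:730"
)["records"]

# 1) Persist the payload off platform (file store, data warehouse, etc.).
with open("order_archive.json", "w") as fh:
    json.dump(old, fh)

# 2) Remove the archived rows from Salesforce via the Bulk API. Deleted rows
#    sit in the Recycle Bin until purged, so storage is not reclaimed until
#    they are hard deleted or the bin is emptied.
sf.bulk.Order__c.delete([{"Id": r["Id"]} for r in old])
```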