A customer wants to use the transactional data from their data warehouse in Data Cloud. They are only able to export the data via an SFTP site. How should the file be brought into Data Cloud?
A. Ingest the file with the SFTP Connector.
B. Ingest the file through the Cloud Storage Connector.
C. Manually import the file using the Data Import Wizard.
D. Use Salesforce's Dataloader application to perform a bulk upload from a desktop.
Explanation:
A. The SFTP Connector is a data source connector that allows Data Cloud to ingest data from an SFTP server. The customer can use the SFTP Connector to create a data stream from their exported file and bring it into Data Cloud as a data lake object. The other options are not the best ways to bring the file into Data Cloud because:
B. The Cloud Storage Connector is a data source connector that allows Data Cloud to ingest data from cloud storage services such as Amazon S3, Azure Storage, or Google Cloud Storage. The customer does not have their data in any of these services, but only on an SFTP site.
C. The Data Import Wizard is a tool that allows users to import data for many standard Salesforce objects, such as accounts, contacts, leads, solutions, and campaign members. It is not designed to import data from an SFTP site or for custom objects in Data Cloud.
D. The Data Loader is an application that allows users to insert, update, delete, or export Salesforce records. It is not designed to ingest data from an SFTP site or into Data Cloud. References: SFTP Connector - Salesforce, Create Data Streams with the SFTP Connector in Data Cloud - Salesforce, Data Import Wizard - Salesforce, Salesforce Data Loader
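For orientation, a minimal sketch of the kind of details an SFTP Connector data stream captures (the keys and values below are illustrative, not the exact labels in the Data Cloud setup UI):

```python
# Illustrative only: the sort of information configured for an SFTP
# Connector data stream in Data Cloud. Keys and values are hypothetical.
sftp_data_stream = {
    "connector": "SFTP",
    "host": "sftp.example.com",            # customer's SFTP site
    "port": 22,
    "auth": {"username": "dc_ingest", "key_file": "/keys/dc_ingest.pem"},
    "directory": "/exports/transactions",  # folder the warehouse exports to
    "file_pattern": "transactions_*.csv",
    "category": "Other",                   # Profile / Engagement / Other
    "refresh_mode": "Upsert",
    "schedule": "DAILY",
}

# Once the stream runs, the file lands as a data lake object that can be
# mapped to the data model.
print(sftp_data_stream["file_pattern"])
```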
A consultant wants to build a new audience in Data Cloud. Which three criteria can the consultant include when building a segment? Choose 3 answers
A. Direct attributes
B. Data stream attributes
C. Calculated Insights
D. Related attributes
E. Streaming insights
Explanation:
A segment is a subset of individuals who meet certain criteria based on their attributes and behaviors. A consultant can use different types of criteria when building a segment in Data Cloud, such as:
Direct attributes: These are attributes that describe the characteristics of an individual, such as name, email, gender, age, etc. These attributes are stored in the Profile data model object (DMO) and can be used to filter individuals based on their profile data.
Calculated Insights: These are insights that perform calculations on data in a data space and store the results as metrics and dimensions that can be referenced in segmentation. These insights can be used to segment individuals based on metrics or scores derived from their data, such as customer lifetime value, churn risk, loyalty tier, etc.
Related attributes: These are attributes that describe the relationships of an individual with other DMOs, such as Email, Engagement, Order, Product, etc. These attributes can be used to segment individuals based on their interactions or transactions with different entities, such as email opens, clicks, purchases, etc.
The other two options are not valid criteria for building a segment in Data Cloud. Data stream attributes are attributes that describe the streaming data that is ingested into Data Cloud from various sources, such as Marketing Cloud, Commerce Cloud, Service Cloud, etc. These attributes are not directly available for segmentation, but they can be transformed and stored in data lake objects using streaming data transforms. Streaming insights are insights that analyze streaming data in real time and trigger actions based on predefined conditions. These insights are not used for segmentation, but for activation and personalization.
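To make the three supported criteria types concrete, here is a purely illustrative sketch of how they might sit together in one segment definition (attribute, object, and insight names are assumptions):

```python
# Hypothetical segment definition combining the three criteria types that
# are valid in Data Cloud segmentation. All names are illustrative.
segment = {
    "name": "US_High_Value_Email_Openers",
    "target": "Unified Individual",
    "filters": [
        # Direct attribute on the Individual profile
        {"attribute": "Individual.Country", "operator": "=", "value": "US"},
        # Related attribute reached through a relationship (engagement DMO)
        {"attribute": "EmailEngagement.EngagementChannelAction",
         "operator": "=", "value": "Open"},
        # Calculated Insight metric used as a filter
        {"attribute": "CI_CustomerLTV.lifetime_value",
         "operator": ">", "value": 1000},
    ],
}
print(len(segment["filters"]), "criteria in the segment")
```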
References: Create a Segment in Data Cloud, Use Insights in Data Cloud, Data Cloud Data Model
A segment fails to refresh with the error "Segment references too many data lake objects (DLOs)". Which two troubleshooting tips should help remedy this issue? Choose 2 answers
A. Split the segment into smaller segments.
B. Use calculated insights in order to reduce the complexity of the segmentation query.
C. Refine segmentation criteria to limit up to five custom data model objects (DMOs).
D. Space out the segment schedules to reduce DLO load.
Explanation:
The error “Segment references too many data lake objects (DLOs)” occurs when a segment query exceeds the limit of 50 DLOs that can be referenced in a single query. This can happen when the segment has too many filters, nested segments, or exclusion criteria that involve different DLOs. To remedy this issue, the consultant can try the following troubleshooting tips:
Split the segment into smaller segments. The consultant can divide the segment into multiple segments that have fewer filters, nested segments, or exclusion criteria. This can reduce the number of DLOs that are referenced in each segment query and avoid the error. The consultant can then use the smaller segments as nested segments in a larger segment, or activate them separately.
Use calculated insights in order to reduce the complexity of the segmentation query. The consultant can create calculated insights that are derived from existing data using formulas. Calculated insights can simplify the segmentation query by replacing multiple filters or nested segments with a single attribute. For example, instead of using multiple filters to segment individuals based on their purchase history, the consultant can create a calculated insight that calculates the lifetime value of each individual and use that as a filter.
The other options are not troubleshooting tips that can help remedy this issue. Refining segmentation criteria to limit up to five custom data model objects (DMOs) is not a valid option, as the limit of 50 DLOs applies to both standard and custom DMOs. Spacing out the segment schedules to reduce DLO load is not a valid option, as the error is not related to the DLO load, but to the segment query complexity.
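As an illustration of the second tip, calculated insights in Data Cloud are defined with SQL-style queries. The sketch below assumes hypothetical object and field names and shows the kind of lifetime-value metric that could replace several purchase-history filters in the segment:

```python
# Hypothetical calculated insight: roll purchase history up to one
# lifetime-value metric per customer. Object and field names are assumed
# for illustration and will differ in a real data model.
LIFETIME_VALUE_CI = """
SELECT
    SUM(ord.GrandTotalAmount__c) AS lifetime_value__c,   -- metric
    ord.CustomerId__c            AS customer_id__c       -- dimension
FROM SalesOrder__dlm ord
GROUP BY ord.CustomerId__c
"""

# In segmentation, a single filter such as lifetime_value__c > 1000 can then
# stand in for several filters that each referenced a different DLO.
print(LIFETIME_VALUE_CI.strip())
```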
References:
Troubleshoot Segment Errors
Create a Calculated Insight
Create a Segment in Data Cloud
The Salesforce CRM Connector is configured and the Case object data stream is set up. Subsequently, a new custom field named Business Priority is created on the Case object in Salesforce CRM. However, the new field is not available when trying to add it to the data stream.
Which statement addresses the cause of this issue?
A. The Salesforce Integration User is missing Read permissions on the newly created field.
B. The Salesforce Data Loader application should be used to perform a bulk upload from a desktop.
C. Custom fields on the Case object are not supported for ingesting into Data Cloud.
D. After 24 hours when the data stream refreshes it will automatically include any new fields that were added to the Salesforce CRM.
Explanation:
The Salesforce CRM Connector uses the Salesforce Integration User to access the data from the Salesforce CRM org. The Integration User must have the Read permission on the fields that are included in the data stream. If the Integration User does not have the Read permission on the newly created field, the field will not be available for selection in the data stream configuration. To resolve this issue, the administrator should assign the Read permission on the new field to the Integration User profile or permission set. References: Create a Salesforce CRM Data Stream, Edit a Data Stream, Salesforce Data Cloud Full Refresh for CRM, SFMC, or Ingestion API Data Streams
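One hedged way to verify the permission is to query the FieldPermissions object; the sketch below uses the simple_salesforce Python library, and the org credentials and the field API name (Business_Priority__c) are assumptions:

```python
# Check which permission sets (including the Integration User's) grant Read
# access on the new Case field. Credentials and field API name are assumed.
from simple_salesforce import Salesforce

sf = Salesforce(username="admin@example.com",
                password="password",
                security_token="token")

result = sf.query(
    "SELECT Parent.Name, PermissionsRead "
    "FROM FieldPermissions "
    "WHERE SobjectType = 'Case' AND Field = 'Case.Business_Priority__c'"
)
for row in result["records"]:
    print(row["Parent"]["Name"], "-> Read:", row["PermissionsRead"])

# If no row grants Read for the Integration User's permission set or profile,
# the field will not appear when editing the data stream.
```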
A customer requests that their personal data be deleted. Which action should the consultant take to accommodate this request in Data Cloud?
A. Use a streaming API call to delete the customer's information.
B. Use Profile Explorer to delete the customer data from Data Cloud.
C. Use Consent API to request deletion of the customer's information.
D. Use the Data Rights Subject Request tool to request deletion of the customer's information.
Explanation:
The Data Rights Subject Request tool is a feature that allows Data Cloud users to manage customer requests for data access, deletion, or portability. The tool provides a user interface and an API to create, track, and fulfill data rights requests. The tool also generates a report that contains the customer’s personal data and the actions taken to comply with the request. The consultant should use this tool to accommodate the customer’s request for data deletion in Data Cloud. References: Data Rights Subject Request Tool, Create a Data Rights Subject Request.
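For illustration only, a hedged sketch of what submitting such a request programmatically might look like; the endpoint path, payload fields, and identifiers are all assumptions, and the UI-based tool described above remains the supported route:

```python
# Hypothetical sketch of a data deletion (right to be forgotten) request.
# The tenant endpoint, API path, and payload shape are assumptions for
# illustration, not documented Salesforce endpoints.
import requests

TENANT = "https://example-tenant.c360a.salesforce.com"  # assumed endpoint
TOKEN = "<access-token>"                                 # obtained via OAuth

payload = {
    "requestType": "DELETE",                  # hypothetical field names
    "subjectType": "INDIVIDUAL",
    "subjectIds": ["0031U00000XyZAbQAN"],     # example record id
}

resp = requests.post(
    f"{TENANT}/api/v1/data-rights/requests",  # hypothetical path
    json=payload,
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
print(resp.status_code, resp.text)
```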
Which consideration related to the way Data Cloud ingests CRM data is true?
A. CRM data cannot be manually refreshed and must wait for the next scheduled synchronization.
B. The CRM Connector's synchronization times can be customized to up to 15-minute intervals.
C. Formula fields are refreshed at regular sync intervals and are updated at the next full refresh.
D. The CRM Connector allows standard fields to stream into Data Cloud in real time.
A customer notices that their consolidation rate is low across their account unification. They have mapped Account to the Individual and Contact Point Email DMOs. What should they do to increase their consolidation rate?
A. Change reconciliation rules to Most Occurring.
B. Disable the individual identity ruleset.
C. Increase the number of matching rules.
D. Update their account address details in the data source.
Explanation:
Consolidation Rate: The consolidation rate in Salesforce Data Cloud refers to the effectiveness of unifying records into a single profile. A low consolidation rate indicates that many records are not being successfully unified.
Matching Rules: Matching rules are critical in the identity resolution process. They define the criteria for identifying and merging duplicate records.
Solution:
Increase Matching Rules: Adding more matching rules improves the system's ability to identify duplicate records. This includes matching on additional fields or using more sophisticated matching algorithms.
Steps:
Access the Identity Resolution settings in Data Cloud.
Review the current matching rules.
Add new rules that consider more fields such as phone number, address, or other unique identifiers.
Benefits:
Improved Unification: Higher accuracy in matching and merging records, leading to a higher consolidation rate.
Comprehensive Profiles: Enhanced customer profiles with consolidated data from multiple sources.
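Purely to illustrate the idea, the sketch below contrasts a ruleset before and after adding match rules; the rule names, match methods, and fields are hypothetical:

```python
# Conceptual representation of an identity resolution ruleset. Adding rules
# gives records from different sources more ways to match. Names are
# illustrative, not actual Data Cloud configuration labels.
ruleset_before = [
    {"rule": "Fuzzy First Name + Exact Last Name + Exact Email",
     "criteria": [("FirstName", "FUZZY"), ("LastName", "EXACT"),
                  ("Email", "EXACT")]},
]

ruleset_after = ruleset_before + [
    {"rule": "Exact Phone + Exact Last Name",
     "criteria": [("Phone", "EXACT"), ("LastName", "EXACT")]},
    {"rule": "Exact Address + Exact Last Name",
     "criteria": [("AddressLine1", "EXACT"), ("PostalCode", "EXACT"),
                  ("LastName", "EXACT")]},
]

# More match paths generally mean more source records unify into a single
# profile, which raises the consolidation rate.
print(f"{len(ruleset_before)} rule(s) before, {len(ruleset_after)} after")
```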
Which two requirements must be met for a calculated insight to appear in the segmentation canvas? (Choose 2 answers)
A. The metrics of the calculated insights must only contain numeric values.
B. The primary key of the segmented table must be a metric in the calculated insight.
C. The calculated insight must contain a dimension including the Individual or Unified Individual Id.
D. The primary key of the segmented table must be a dimension in the calculated insight.
Explanation:
A calculated insight is a custom metric or measure that is derived from one or more data model objects or data lake objects in Data Cloud. A calculated insight can be used in segmentation to filter or group the data based on the calculated value. However, not all calculated insights can appear in the segmentation canvas. There are two requirements that must be met for a calculated insight to appear in the segmentation canvas:
The calculated insight must contain a dimension including the Individual or Unified Individual Id. A dimension is a field that can be used to categorize or group the data, such as name, gender, or location. The Individual or Unified Individual Id is a unique identifier for each individual profile in Data Cloud. The calculated insight must include this dimension to link the calculated value to the individual profile and to enable segmentation based on the individual profile attributes.
The primary key of the segmented table must be a dimension in the calculated insight. The primary key is a field that uniquely identifies each record in a table. The segmented table is the table that contains the data that is being segmented, such as the Customer or the Order table. The calculated insight must include the primary key of the segmented table as a dimension to ensure that the calculated value is associated with the correct record in the segmented table and to avoid duplication or inconsistency in the segmentation results.
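A hedged sketch of a calculated insight that meets both requirements, with assumed object and field names:

```python
# Hypothetical calculated insight that is usable in the segmentation canvas:
# the numeric metric is grouped by the Unified Individual Id, so that id is
# exposed as a dimension and ties each value back to the segmented table's
# primary key. Object and field names are assumptions.
SEGMENTABLE_CI = """
SELECT
    COUNT(eng.Id__c)        AS email_open_count__c,        -- numeric metric
    eng.IndividualId__c     AS unified_individual_id__c    -- dimension
FROM EmailEngagement__dlm eng
GROUP BY eng.IndividualId__c
"""
print(SEGMENTABLE_CI.strip())
```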
Northern Trail Outfitters wants to be able to calculate each customer's lifetime value (LTV) but also create breakdowns of the revenue sourced by website, mobile app, and retail channels. How should this use case be addressed in Data Cloud?
A. Nested segments
B. Flow orchestration
C. Streaming data transformations
D. Metrics on metrics
Explanation:
Streaming data transformations can help Northern Trail Outfitters calculate each customer's lifetime value (LTV) and create breakdowns of the revenue sourced by different channels. They allow you to transform and enrich streaming data from different sources using formulas and operators as the data is ingested.
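Whichever feature implements it, the underlying calculation groups revenue by both customer and channel so the per-channel figures can roll up into an overall LTV; a hedged SQL-style sketch with assumed object and field names:

```python
# Hypothetical revenue rollup per customer and channel. Object and field
# names are assumptions for illustration.
LTV_BY_CHANNEL = """
SELECT
    SUM(ord.GrandTotalAmount__c) AS revenue__c,      -- metric
    ord.CustomerId__c            AS customer_id__c,  -- dimension
    ord.PurchaseChannel__c       AS channel__c       -- Website / Mobile App / Retail
FROM SalesOrder__dlm ord
GROUP BY ord.CustomerId__c, ord.PurchaseChannel__c
"""
print(LTV_BY_CHANNEL.strip())
```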
Northern Trail Outfitters wants to use some of its Marketing Cloud data in Data Cloud. Which engagement channel data will require custom integration?
A. SMS
B. Email
C. CloudPage
D. Mobile push
Explanation:
CloudPage is a web page that can be personalized and hosted by Marketing Cloud. It is not one of the standard engagement channels that Data Cloud supports out of the box. To use CloudPage data in Data Cloud, a custom integration is required. The other engagement channels (SMS, email, and mobile push) are supported by Data Cloud and can be integrated using the Marketing Cloud Connector or the Marketing Cloud API.
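As one hedged example of such a custom integration, CloudPage engagement events could be pushed into Data Cloud through the Ingestion API; the tenant endpoint, source API name, object name, and event fields below are assumptions:

```python
# Hypothetical sketch: stream a CloudPage engagement event into Data Cloud
# via the Ingestion API. Endpoint host, source name, object name, and field
# names are assumptions for illustration.
import requests

TENANT = "https://example-tenant.c360a.salesforce.com"
TOKEN = "<access-token>"   # obtained through the connected app OAuth flow

event = {
    "data": [
        {
            "eventId": "cp-000123",
            "individualId": "0031U00000XyZAbQAN",
            "cloudPageUrl": "https://cloud.example.com/preferences",
            "eventType": "PageView",
            "eventDateTime": "2024-05-01T12:34:56Z",
        }
    ]
}

resp = requests.post(
    f"{TENANT}/api/v1/ingest/sources/CloudPage_Connector/cloudpage_events",
    json=event,
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
print(resp.status_code)
```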