Data Cloud Data Strategy

Customer-Centric Data Strategies

Engaging and satisfying your customers requires a thoughtful approach to handling data. A good data strategy helps with a few key things:

  1. Understanding your customers better: It helps you get a complete picture of your customers, their preferences, and behaviors.

  2. Tailoring your products and services: It lets you adjust what you offer to match what your customers want.

  3. Enhancing customer service: It helps improve how you assist and support your customers.

Data architecture involves:

  1. Deciding what data your company collects and processes.

  2. Figuring out how to store that data.

  3. Managing how data moves around your organization.

When you're setting up your data architecture with Salesforce, remember to focus on your customers. This is super important because it helps your business in these ways:

  1. Get More Value: By keeping customers in mind, you make sure your plans are effective, saving time and money.

  2. Don't Miss Anything: Understanding how customers interact with your company helps you avoid mistakes and missed opportunities in their experience.

  3. Make Smart Choices: Your decisions should aim at making customers happy. Your data setup should help with that.

Data Flow

Data Flow in Data Cloud means you can use Data Cloud to both bring in and send out data. You can either ingest data into Data Cloud and store it there, or connect to it through metadata descriptions and query it in place without actually saving it in Data Cloud. This flexibility lets you work with your data in the way that best suits your needs.
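
For the query-in-place route, the usual pattern is to issue SQL against Data Cloud over its query interface instead of copying rows in. The snippet below is a minimal sketch of that idea in Python: the endpoint path, the access token handling, and the DMO and field names are all assumptions, so check your org's API documentation before relying on them.

```python
import requests

# Assumptions: INSTANCE_URL and ACCESS_TOKEN come from your own OAuth flow,
# and the query endpoint path may differ in your org -- verify before use.
INSTANCE_URL = "https://your-org.my.salesforce.com"   # hypothetical
ACCESS_TOKEN = "00D...your-token..."                   # hypothetical

def query_data_cloud(sql: str) -> list[dict]:
    """Run a read-only SQL query against Data Cloud without ingesting data."""
    response = requests.post(
        f"{INSTANCE_URL}/api/v2/query",                # assumed endpoint path
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={"sql": sql},
        timeout=30,
    )
    response.raise_for_status()
    return response.json().get("data", [])

# Hypothetical DMO and field names; replace with ones from your data model.
rows = query_data_cloud("SELECT * FROM Individual__dlm LIMIT 10")
print(rows)
```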

You can connect and ingest your data in various ways:

  • Salesforce Connectors

  • Connector Services

  • Connectors and Integrations

Data Targets and Data Shares

Once the data is processed, organized, or split into segments within Data Cloud, you can then send it to different destinations or share it with other systems:

  1. Send to SFTP: You can send the data to a Secure File Transfer Protocol (SFTP) server for storage or further processing.

  2. Share with Integrations (e.g., Snowflake): You can also share the data with other integrated systems, like Snowflake, a popular data warehousing platform, where it can be used for purposes such as analysis and reporting. A brief sketch of querying a share from the Snowflake side follows this list.
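
To make the Snowflake side concrete, here is a minimal sketch of reading a shared Data Cloud table from Snowflake with the snowflake-connector-python package. The account, warehouse, database, schema, and table names are placeholders; in practice the shared data appears as a read-only database that your Snowflake admin mounts from the share.

```python
import snowflake.connector

# All connection values below are placeholders for your own Snowflake account.
conn = snowflake.connector.connect(
    account="your_account_locator",      # hypothetical
    user="ANALYST_USER",                 # hypothetical
    password="********",
    warehouse="ANALYTICS_WH",            # hypothetical
)

try:
    cur = conn.cursor()
    # DATA_CLOUD_SHARE / UNIFIED / INDIVIDUAL are illustrative names only;
    # use the database and schema your admin created from the Data Cloud share.
    cur.execute("SELECT COUNT(*) FROM DATA_CLOUD_SHARE.UNIFIED.INDIVIDUAL")
    print("Shared individuals:", cur.fetchone()[0])
finally:
    conn.close()
```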

Targets serve several important purposes:

  1. Activate Segments: You can use targets to activate data segments in various platforms like Marketing Cloud Engagement, Amazon S3, SFTP, Google Cloud Storage (GCS), and Microsoft Azure Blob Storage. This means you can send your segmented data to these platforms for various purposes.

  2. Activate Segments in Advertising Platforms: Targets also enable you to activate segments in advertising platforms like Meta (formerly known as Facebook). This allows you to use your data for targeted advertising campaigns.

  3. Perform Actions in Sales or Service Cloud: You can use targets to perform actions in Sales Cloud or Service Cloud using your data. This might involve tasks related to customer relationship management or service operations (a minimal sketch of this kind of action follows this list).

  4. Analyze Data in External Tools: Lastly, targets let you analyze your data within external tools like Tableau or CRM Analytics. This means you can take your data and use it for in-depth analysis and reporting using these tools.
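
As a concrete illustration of point 3, the downstream action is often as simple as creating a follow-up task on a record in Sales or Service Cloud. The sketch below uses the simple-salesforce Python library for that step; the credentials and record ID are placeholders, and in practice the records to act on would come from a Data Cloud segment or calculated insight rather than being hard-coded.

```python
from simple_salesforce import Salesforce

# Placeholder credentials -- supply your own username, password, and token.
sf = Salesforce(
    username="user@example.com",
    password="********",
    security_token="your-security-token",
)

# Hypothetical contact ID that a Data Cloud segment flagged as high churn risk.
contact_id = "003XXXXXXXXXXXXXXX"

sf.Task.create({
    "Subject": "Follow up with at-risk customer",
    "WhoId": contact_id,          # links the task to the contact
    "Status": "Not Started",
    "Priority": "High",
})
```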

Data Model Concepts

With Data Cloud, you have the ability to link data between the data sources you've connected and the Customer 360 Data Model. This means you can establish connections and relationships between your various sources of data and the central Customer 360 Data Model, allowing for a unified and comprehensive view of your customer data.

The Customer 360 Data Model is a standard data framework within Data Cloud that makes it easier to work with data across different applications. This model simplifies the process of integrating data from various cloud-based applications by offering standardized rules for data compatibility. You can use this model as a foundation and expand it to create data lakes, perform analytics, train machine learning models, establish a unified customer profile, and more.

The Customer 360 Data Model is structured into different subject areas, each representing a key business concept. These subject areas cover things like customer information, product data, and engagement details, making it simpler to organize and understand your data in a meaningful way.

A subject area in Customer 360 groups similar Data Model Objects (DMOs) together; these groupings are called data model subject areas.
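
To make the grouping concrete, the structure below sketches how a few subject areas relate to the DMOs they contain. The exact set of DMOs in your org depends on which data bundles and customizations you have, so treat these names as illustrative rather than a complete or authoritative list.

```python
# Illustrative only: subject areas map to the DMOs typically grouped under them.
subject_areas = {
    "Party": ["Individual", "Contact Point Email", "Contact Point Phone"],
    "Product": ["Product", "Product Catalog"],
    "Engagement": ["Email Engagement", "Website Engagement"],
    "Sales Order": ["Sales Order", "Sales Order Product"],
}

for area, dmos in subject_areas.items():
    print(f"{area}: {', '.join(dmos)}")
```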

Data Model Objects

Objects in the data model created by the customer for a Data Cloud implementation are called Data Model Objects. When you create a new object, it can be based on a reference object. A Data Model Object that uses a reference object inherits the name, shape, and semantics of that reference object and is called a standard object. You can also define an entirely custom Data Model Object, called a custom object.

Data Ingestion and Modeling

Data Ingestion

This is the initial step, where data is collected from a data source without any alterations: the fields and their data types are imported exactly as they are. To perform this, you establish a data stream that brings the data into Data Cloud. All data ingested through these data streams is then written into data lake objects (DLOs).
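
If you are pushing records in programmatically rather than pulling them through a connector, the pattern is typically a POST of JSON records to an ingestion endpoint tied to your data stream. The sketch below assumes an endpoint of the form /api/v1/ingest/sources/{source}/{object} with placeholder source and object names; verify the exact path and payload shape in your org's documentation before using it.

```python
import requests

# Placeholders: the tenant endpoint, token, source name, and object name
# all depend on how your data stream was configured.
TENANT_ENDPOINT = "https://your-tenant.c360a.salesforce.com"   # hypothetical
ACCESS_TOKEN = "your-data-cloud-token"                          # hypothetical
SOURCE = "ecommerce_api"                                        # hypothetical source name
OBJECT_NAME = "purchase_events"                                 # hypothetical object name

records = [
    {"customer_id": "42", "product_sku": "SKU-123", "amount": 19.99},
]

response = requests.post(
    f"{TENANT_ENDPOINT}/api/v1/ingest/sources/{SOURCE}/{OBJECT_NAME}",  # assumed path
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"data": records},   # fields land in the DLO exactly as sent
    timeout=30,
)
response.raise_for_status()
print("Accepted:", response.status_code)
```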

Data Modeling

In the data modeling phase, the original source schemas are transformed and organized into standardized or customizable data models. These data models are typically based on the Customer 360 Data Model, which provides clear and consistent data guidelines. During this phase, connections are established between data from different sources by mapping Data Lake Objects (DLOs) to Data Model Objects (DMOs).

A Data Model Object (DMO) is essentially a structured grouping of data that is derived from various sources, including data streams and insights. This phase helps ensure that the data is structured in a consistent and meaningful way, making it easier to work with and derive insights from disparate data sources.
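
As a simple illustration of what such a mapping looks like conceptually, the dictionary below pairs raw DLO field names with the DMO fields they feed. Both the DLO schema and the target field labels here are made up for the example; real mappings are defined in the Data Cloud UI (or via metadata), not in application code.

```python
# Hypothetical mapping: raw fields from a DLO -> fields on the Individual DMO.
dlo_to_dmo_mapping = {
    ("ecommerce_customers_dlo", "Individual"): {
        "cust_id": "Individual Id",
        "first_nm": "First Name",
        "last_nm": "Last Name",
        "birth_dt": "Birth Date",
    },
}

for (dlo, dmo), fields in dlo_to_dmo_mapping.items():
    print(f"{dlo} -> {dmo}")
    for source_field, target_field in fields.items():
        print(f"  {source_field} -> {target_field}")
```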

Data Objects in Data Cloud

A Data Source Object (DSO) is like a bucket of data, such as customer purchase information. This data comes into Data Cloud from a data stream or data bundle. Essentially, a DSO is a way to represent the data that has been brought into the system.

A Data Lake Object (DLO) is like a storage container for the data that enters Data Cloud. It's created automatically when data is brought in from a Data Source Object (DSO), but you can also create it manually if needed. Think of it as a place where the data is stored and managed within the system.

A Data Model Object (DMO) is like a structured collection of data that's been organized and grouped together from various sources like data streams and insights. It's a way to make the data more organized and understandable. To make this happen, a Data Lake Object (DLO) is linked or mapped to a DMO, which helps in structuring and managing the data effectively.
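
Putting the three object types side by side, the sketch below models the DSO-to-DLO-to-DMO flow as plain Python dataclasses. It is purely conceptual: the class and field names are invented to show how each object builds on the previous one, not how Data Cloud represents them internally.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class DataSourceObject:
    """Raw data as it arrives from a data stream or data bundle."""
    name: str
    records: List[Dict]

@dataclass
class DataLakeObject:
    """Storage container created (usually automatically) from a DSO."""
    name: str
    source: DataSourceObject

@dataclass
class DataModelObject:
    """Standardized view produced by mapping a DLO's fields onto the model."""
    name: str
    field_mappings: Dict[str, str]  # DLO field -> DMO field

# Hypothetical flow: purchases arrive, land in a DLO, then map to a DMO.
dso = DataSourceObject("customer_purchases", records=[{"cust_id": "42", "amount": 19.99}])
dlo = DataLakeObject("customer_purchases_dlo", source=dso)
dmo = DataModelObject("Sales Order", field_mappings={"cust_id": "Sold To Customer Id",
                                                     "amount": "Order Amount"})
print(dmo)
```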
