MuleSoft & AI: ingest AI data as a stream to Salesforce Data Cloud!


Introduction

What is Salesforce Data Cloud? It is a hyperscale data platform integrated into Salesforce that lets you connect your data with the CRM, so your teams can act on the data and insights coming from the Salesforce processes and applications already in use.

Data ingestion can take place through different streams (databases, Amazon AWS, Azure, etc.), but not only: it is also possible to use OpenAI to generate data and use it as a stream for ingestion!


Scenario

First of all, we need to understand what type of data we need: which object, which fields, and so on.

For example, let's consider that the reference object is 'customer' and that the fields we are interested in are the most common ones, such as 'First Name', 'Last Name' and 'Date of Birth'.

So let's write a prompt for the AI, such as:

"Generate a sample synthetic dataset named customer in json with 5 rows, consisting of the following fields: 1. FirstName 2. LastName 3. Address 4. Email 5. DateOfBirth 6. PostalCode 7. CreatedDate"        


We will use this prompt in the request that we send to the AI, and we will implement this subflow:


[Image: AI request subflow]
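
For reference, the body of the request sent to the AI can be built with a Transform Message step before the HTTP Request. Here is a minimal DataWeave sketch, assuming the OpenAI Chat Completions request shape (the model name is just a placeholder, use whatever your HTTP Request is configured for):

%dw 2.0
output application/json
---
{
    // Model name is an assumption: pick the one your HTTP Request actually targets
    model: "gpt-3.5-turbo",
    messages: [
        {
            role: "user",
            content: "Generate a sample synthetic dataset named customer in json with 5 rows, "
                ++ "consisting of the following fields: 1. FirstName 2. LastName 3. Address "
                ++ "4. Email 5. DateOfBirth 6. PostalCode 7. CreatedDate"
        }
    ]
}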


This subflow is called from the main flow; once the AI response has been obtained, it is used to call the subflow dedicated to Salesforce Data Cloud. This is the main flow:


[Image: Main flow]
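
Between the two calls, the main flow has to turn the text returned by the AI into real JSON before it can be streamed to Data Cloud. A minimal DataWeave sketch of that step, assuming the standard Chat Completions response shape (the generated dataset arrives as a string in choices[0].message.content):

%dw 2.0
output application/json
---
// The AI returns the dataset as a JSON string inside the chat completion;
// read() parses that string back into a JSON structure ({ "data": [ ... ] })
read(payload.choices[0].message.content, "application/json")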

The 'Set Object' variable is used to save the object name extracted from the request; we will use it to build the request for Data Cloud and its configuration. This makes the flow adaptable to the object being ingested (customer, order, product, etc.).
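
A minimal sketch of the value behind 'Set Object', assuming the caller passes the object name in an 'object' field of the initial request (both the field name and the variable name are assumptions):

%dw 2.0
output application/java
---
// Saved as vars.objectName and reused later to configure Data Cloud,
// e.g. "customer", "order" or "product"
payload.object default "customer"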

This is the Data Cloud subflow:


[Image: Data Cloud subflow]


And this is its configuration, which also uses the variable that we set before:


[Image: Data Cloud connector configuration]

So, in this case, if we consider the initial request that triggers the main flow, the object will be 'customer', and consequently Data Cloud will be configured with:

Source API name: customer_demo
Object name: customerData
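
Both values can be derived from the variable set earlier, so the same flow works for any object. A sketch of the two expressions, assuming the variable is called objectName and that the '_demo' and 'Data' suffixes simply mirror the naming convention used in this example:

%dw 2.0
output application/json
---
{
    // vars.objectName was saved by 'Set Object' (here: "customer")
    sourceApiName: vars.objectName ++ "_demo", // -> "customer_demo"
    objectName: vars.objectName ++ "Data"      // -> "customerData"
}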


And this information matches what we have configured in our Data Cloud org:


[Image: Ingestion API configuration in Data Cloud]

The output of the main flow is this:


{
    "response_data_cloud": "accepted true",
    "data_ai": {
        "data": [
            {
                "FirstName": "John",
                "LastName": "Doe",
                "Address": "123 Main St",
                "Email": "johndoe@email.com",
                "DateOfBirth": "1985-05-15",
                "PostalCode": "12345",
                "CreatedDate": "2022-01-01"
            },
            {
                "FirstName": "Jane",
                "LastName": "Smith",
                "Address": "456 Oak Ave",
                "Email": "janesmith@email.com",
                "DateOfBirth": "1990-09-20",
                "PostalCode": "54321",
                "CreatedDate": "2022-01-05"
            },
            {
                "FirstName": "Michael",
                "LastName": "Johnson",
                "Address": "789 Elm Rd",
                "Email": "michaeljohnson@email.com",
                "DateOfBirth": "1980-02-10",
                "PostalCode": "98765",
                "CreatedDate": "2022-01-10"
            },
            {
                "FirstName": "Emily",
                "LastName": "Brown",
                "Address": "101 Pine Ln",
                "Email": "emilybrown@email.com",
                "DateOfBirth": "1995-11-25",
                "PostalCode": "13579",
                "CreatedDate": "2022-01-15"
            },
            {
                "FirstName": "David",
                "LastName": "Martinez",
                "Address": "202 Cedar Dr",
                "Email": "davidmartinez@email.com",
                "DateOfBirth": "1975-08-05",
                "PostalCode": "46820",
                "CreatedDate": "2022-01-20"
            }
        ]
    }
}        


Where:

"response_data_cloud" is the response obtained after calling Data Cloud using the connector and "data_ai" is the response that the AI return us and that we used as input for Data Cloud.

If we go to Data Cloud and check what happened (after creating and configuring the Data Stream using the 'customer' Ingestion API), this is the result:


[Image: ingested records in the Data Cloud Data Stream]


