Oracle Integration Cloud (OIC) as Low-code Platform

OCI Streaming to ADW - Insert Master/Detail with OIC Gen3

Oracle Integration Cloud (OIC) is a cloud-native, serverless, no-code/low-code service that provides an enterprise connectivity and automation platform for quickly modernizing applications, business processes, APIs, and data.

The prototype cookbook below is derived from a real Kafka -> OIC -> ADW scenario. As the title suggests, I will use simplified OCI resources as replacements: OCI Streaming, which is Kafka-compatible, serves as the data messaging/streaming service.

The following diagram depicts an architecture that follows this integration use case:

[Figure: integration architecture, OCI Streaming to OIC to ADW]

Joining the Payload List Objects and Inserting into Autonomous Data Warehouse

The aim of this cookbook is to show how to use OIC to transform a structured JSON payload containing several data lists into inserts into database tables in ADW. The example can be used for inserting data into master-detail, i.e., parent-child tables.

This is a low-code integration: only a little code is needed to transform the JSON message for ADW.

To reflect the real world, we use a simplified scenario in which a Betting System produces Betting Tickets.

On a Betting Ticket you can find information that identifies the Ticket and the Customer.

The published message/streaming data contains two child lists: Betting Picks (PICKS), chosen from the available sport matches and competitions, and Evaluations (EVAL) of the customer's guessed match results. The target analytics Data Warehouse expects the PICKS and EVAL entities merged into one.

[Figure: Betting Ticket structure with PICKS and EVAL child lists]

Example of the simplified JSON payload:

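A minimal sketch of such a payload might look as follows. Only the ticket, eval, pickId, and status names are taken from the XSLT mapping shown later (ticket_id=15 reappears in the test at the end); the remaining field names are illustrative assumptions:

  {
    "ticket": {
      "ticketId": 15,
      "customerId": "C-1001",
      "picks": [
        { "pickId": 1, "match": "Team A vs Team B", "competition": "League 1", "pick": "1" },
        { "pickId": 2, "match": "Team C vs Team D", "competition": "Cup", "pick": "X" }
      ],
      "eval": [
        { "pickId": 1, "status": "WON" },
        { "pickId": 2, "status": "LOST" }
      ]
    }
  }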

Oracle Integration is subscribed to the stream/topic. OIC consumes the above JSON message from the stream/topic and publishes the data to Autonomous Data Warehouse in the form of a TICKET entity and a merged/joined TICKET_PICKS_EVALS entity.

[Figure: target entities TICKET and TICKET_PICKS_EVALS in ADW]
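As a sketch, the two target tables could be shaped as follows. Apart from the table names and the pickId/status pair used in the mapping, the columns are illustrative assumptions. Note that the child table is created without a primary key, which becomes relevant in a later wizard step:

  CREATE TABLE TICKET (
    TICKET_ID    NUMBER PRIMARY KEY,
    CUSTOMER_ID  VARCHAR2(32)
  );

  -- Child table holding the merged PICKS and EVAL data; no primary key on purpose
  CREATE TABLE TICKET_PICKS_EVALS (
    TICKET_ID   NUMBER REFERENCES TICKET (TICKET_ID),
    PICK_ID     NUMBER,
    MATCH_NAME  VARCHAR2(128),
    STATUS      VARCHAR2(16)
  );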

Now you can create the integration that consumes messages from the stream/Kafka topic. You will iterate over the consumed messages if you allow more than one message to be consumed from the stream at once (the typical polling frequency is 10 seconds, and consuming in batches helps balance the workload over time).

Your subscription to the stream may look like the one below:

[Figure: OCI Streaming subscription (consume messages) configuration]

You will iterate over the messages consumed from the stream.

[Figure: for-each loop over the consumed messages]

For each consumed message you will insert the data into ADW.

[Figure: ADW Adapter invoke inside the for-each loop]

You will choose “Perform an Operation on a Table” and select the Insert operation. Next, you will find and import the tables you want to insert the data into.

[Figure: ADW Adapter wizard, Insert operation on the imported tables]

After importing the tables, you may be asked to provide a primary key for any table where none was defined, as in my case.

[Figure: wizard prompt to define a primary key for the imported table]

In this case the wizard asked for the primary key of the child table; it is used for validation, i.e., for identifying the fields that must be supplied as mandatory on insert.

Then you are asked to select the parent table from the list of chosen tables.

[Figure: selecting the parent table]

In this case the TICKET table was selected as the parent.

In the generated mapping you will map visually what is possible. Where the payload contains several lists, drive the mapping from the list that covers the greater number of target fields (here PICKS); the fields from the other list (EVAL) will be joined in by code in the next step.

[Figure: generated mapping with Ticket and TicketPicksEvals targets]

In the mapping sheet, the target DB payload shows the generated parent and child table structures, in this case the Ticket element and the TicketPicksEvals element.

Where a list is mapped, the mapper generates a “for-each” directive in the code. We will switch to the XSLT code view (click the icon in the red rectangle).

[Figure: mapper with the XSLT code view icon highlighted]

You will add a variable to use in the XPath selectors, so you can pick values from the other list based on the shared “foreign key” pickId. The surrounding for-each over the PICKS list is shown below for context; the exact element names depend on your schema.

  <!-- for-each generated by the mapper over the PICKS list; the path is illustrative -->
  <xsl:for-each select="$f0_Message/nsmpr0:Message/nsmpr0:request-wrapper/ns16:content/ns16:ticket/ns16:picks">
    <nstrgmpr:TicketPicksEvals>
      <xsl:variable name="current_pickId" select="ns16:pickId"/>
      <nstrgmpr:status>
        <!-- join: select the EVAL entry that carries the same pickId -->
        <xsl:value-of select="$f0_Message/nsmpr0:Message/nsmpr0:request-wrapper/ns16:content/ns16:ticket/ns16:eval[ns16:pickId=$current_pickId]/ns16:status"/>
      </nstrgmpr:status>
    </nstrgmpr:TicketPicksEvals>
  </xsl:for-each>
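Note that the XPath predicate lookup scans the EVAL list once for every pick. For small lists like this one that is perfectly fine; for payloads with thousands of list elements, see the staging-table note in the summary below.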

Now you should validate the map and return to the flow composer.

[Figure: validated map and the integration flow]

The final integration still looks very simple.

You can start publishing messages, and your activated integration should transfer the data into ADW.

In this case, a publisher service that publishes tickets to the stream was triggered, to verify that the subscriber is triggered in turn and writes the data to the ADW tables.

[Figure: publisher service sending a ticket to the stream]

The ticket with ticket_id=15 was published. In OIC observability, the Ticket consumer instance should appear.

[Figure: Ticket consumer instance in OIC observability]

We can see that the integration ran successfully. We can check the activity stream and the payload after the XSLT transformation.

ADW should show that the data is present.

One Invoke to ADW created records in the parent and child tables.
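A quick way to verify the result is a join over the illustrative model sketched earlier (the column names are the same assumptions as above):

  SELECT t.TICKET_ID, t.CUSTOMER_ID, c.PICK_ID, c.STATUS
  FROM   TICKET t
  JOIN   TICKET_PICKS_EVALS c ON c.TICKET_ID = t.TICKET_ID
  WHERE  t.TICKET_ID = 15;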

Summary

Now you know how to use OIC as a low-code platform to transform structured payloads from a messaging or REST interface and write the data into ADW with a single invoke action using the ADW Adapter.

Note: If your payload is huge, e.g., contains thousands of list elements, it is recommended to model and use staging tables in ADW and perform the outer joins and inserts into the target model and tables there, as sketched below.
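As a minimal sketch of that staging approach, assuming hypothetical staging tables STG_PICKS and STG_EVALS loaded one-to-one from the two payload lists, the join then happens in the database rather than in XSLT:

  INSERT INTO TICKET_PICKS_EVALS (TICKET_ID, PICK_ID, MATCH_NAME, STATUS)
  SELECT p.TICKET_ID, p.PICK_ID, p.MATCH_NAME, e.STATUS
  FROM   STG_PICKS p
  LEFT OUTER JOIN STG_EVALS e
         ON  e.TICKET_ID = p.TICKET_ID
         AND e.PICK_ID   = p.PICK_ID;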

References:

Oracle Autonomous Data Warehouse Adapter
OIC Apache Kafka Adapter
OCI Streaming with Apache Kafka
Oracle Integration PM Blog


Credits: Niall Commiskey, Stan Tanev, Valeria Chiran, Andrew Bond, Marcel Straka, Deepak Arora, Juan Carlos Gonzáles Carrero, Harris Qureshi, Juergen Kress, Ravi Pinto, Giovanni Conte

#integration #EiPaaS #oraclecloud #oracleIntegration #OCI #LowCode #ProcessAutomation
