Friday, August 22, 2025

OIC - Sending JSON Data as Email Body Content via Twilio SendGrid API in OIC

Use Case Description:

When integrating Oracle Integration Cloud (OIC) with Twilio SendGrid API to send emails, JSON data intended for the email body is sometimes sent as an attachment instead of inline content. This issue occurs because the API interprets the payload as attachment content, especially if the content type isn't correctly specified. The goal is to send JSON data directly within the email body using the string() function and setting the Content-Type as text/plain or application/json.

Solution Steps:
1. Serialize JSON Data to String
Use the string() function in OIC to convert your JSON object into a string format suitable for embedding in the email body.
2. Structure the Email Payload
Prepare the payload adhering to SendGrid's API format:
  • Set "content" with "type": "text/plain" or "application/json".
  • Include the serialized JSON string in the "value" field.
Example payload snippet:
{
  "personalizations": [
    { "to": [{ "email": "recipient@example.com" }], "subject": "JSON Data Email" }
  ],
  "from": { "email": "sender@example.com" },
  "content": [
    { "type": "text/plain", "value": "{your JSON string here}" }
  ]
}
(Use "application/json" as the content type instead of "text/plain" if required.)
3. Set Up API Headers
Ensure the HTTP headers include:
Content-Type: application/json
4. Make the API Call in OIC
Use an HTTP action to POST the above payload to SendGrid's API endpoint (https://api.sendgrid.com/v3/mail/send).
Pass the API key in the Authorization header as a Bearer token.
5. Validate the Email Content
Check the received email to confirm that JSON data appears inline in the email body, not as an attachment.
Summary:
By serializing JSON data with string(), structuring the payload correctly, and setting the content type appropriately, you can send JSON data directly as the email body in Twilio SendGrid API through OIC, avoiding it being treated as an attachment.
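Outside OIC, the equivalent call can be sketched in Python. This is a minimal illustration with placeholder addresses and API key: build_mail_payload mirrors steps 1-3 (serialize, structure, set content type) and send_mail mirrors step 4.

```python
import json

SENDGRID_URL = "https://api.sendgrid.com/v3/mail/send"

def build_mail_payload(data: dict, to_addr: str, from_addr: str,
                       content_type: str = "text/plain") -> dict:
    """Embed a JSON object as inline email body content (not an attachment)."""
    # Serialize the JSON object to a string -- the equivalent of OIC's string()
    body_text = json.dumps(data, indent=2)
    return {
        "personalizations": [{"to": [{"email": to_addr}],
                              "subject": "JSON Data Email"}],
        "from": {"email": from_addr},
        # Inline body: "content" entry with type text/plain (or application/json)
        "content": [{"type": content_type, "value": body_text}],
    }

def send_mail(payload: dict, api_key: str) -> int:
    """POST the payload to SendGrid; returns the HTTP status (202 on success)."""
    import urllib.request
    req = urllib.request.Request(
        SENDGRID_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```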

Wednesday, August 13, 2025

OIC - How to Reprocess an HCM Extract in OIC Without Resubmitting the Flow

Use Case

In Oracle HCM integrations, it’s common to schedule extracts and process their output to a target destination like Oracle Object Storage. However, in real-world scenarios, the extract may fail in the middle of the process — for example, due to downstream errors — even though the extract itself completed successfully in HCM.

When this happens, you often want to reprocess the existing extract output rather than re-running the extract flow in HCM (which could cause data duplication or require additional system resources).

To handle this, we design an OIC-based solution with:

  • A Scheduler Integration to initiate the process.
  • A Main Integration to execute or reprocess the extract depending on parameters.

Solution Approach

We will create two integrations:

1. Scheduler Integration

  • Purpose: Accepts runtime parameters and decides whether to submit a new extract or reprocess an existing one.
  • Parameters:
    • EffectiveDate – Extract run date (YYYY-MM-DD format)
    • ExtractFlowInstanceName – Name of the extract flow instance to reprocess
    • SkipExtractSubmission – Yes/No flag to skip submitting the extract and instead retrieve an existing output
  • Logic:
    • If SkipExtractSubmission = No → Call HCM submit extract API, wait for completion, download the file.
    • If SkipExtractSubmission = Yes → Skip submit step, directly get extract instance details, retrieve document ID, and download from UCM.



2. Main Integration

  • Purpose: Handles the extract execution, monitoring, file retrieval, and delivery to Object Storage.
  • Key Steps:
    1. Assign & Initialize Variables – Store parameters.
    2. Switch Condition – Decide if extract needs submission or reprocessing.
    3. While Loop – Poll HCM extract status until completion.
    4. Get Document ID – Retrieve from extract instance data.
    5. Download from UCM – Fetch the output file.
    6. Transform Data – Apply required mapping/format changes.
    7. Upload to Object Storage – Store file in the designated bucket.
    8. Error Handling – Throw faults if extract fails or file retrieval fails.
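Step 3's While loop can be sketched in Python. Here get_status is a placeholder for the real HCM extract-status call, so treat this as an illustration of the polling pattern rather than the actual adapter invoke:

```python
import time

def wait_for_extract(get_status, timeout_s: int = 1800, poll_s: int = 30) -> str:
    """Poll an HCM extract until it completes (the integration's While loop).

    get_status is a callable returning the flow instance status string,
    e.g. "COMPLETED" or "ERROR"; in OIC this would be the HCM invoke.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = get_status()
        if status == "COMPLETED":
            return status
        if status in ("ERROR", "FAILED"):
            # Step 8: throw a fault if the extract fails
            raise RuntimeError(f"Extract ended in status {status}")
        time.sleep(poll_s)
    raise TimeoutError("Extract did not complete within the allowed time")
```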

High-Level Flow Diagram



Benefits

  • No Duplicate Data – Avoid re-running the same extract unnecessarily.
  • Faster Recovery – Quickly reprocess failed integrations.
  • Parameter Driven – Flexible execution controlled at runtime.
  • Error Handling Built-In – Ensures issues are caught and handled.

For details on how to download an HCM extract, see my previous blog:

https://soalicious.blogspot.com/2024/08/oic-hcm-how-to-schedule-and-download.html?m=1

Tuesday, August 12, 2025

OIC - Handling SOAP XML in a REST Trigger with Oracle Integration (OIC)

How to accept and respond with SOAP XML payloads in a REST API


Use Case

Many legacy systems still use SOAP-based XML messages for data exchange, while modern applications and integrations often rely on REST APIs.
In this scenario, we need to create an OIC REST Trigger that can:

  1. Accept a SOAP XML payload as input (request).
  2. Process the data.
  3. Return a SOAP XML response back to the caller.

This allows seamless communication between SOAP-based systems and modern RESTful endpoints without requiring the legacy system to change.


Solution Steps

1. Design the OIC Integration

  • Create a new App-Driven Orchestration in Oracle Integration.
  • Select REST as the trigger connection.

2. Configure the REST Trigger

  • Resource URL: e.g., /soapxmlhandler
  • HTTP Method: POST
  • Request Payload:
    • Set the media type to application/xml or text/xml.
    • Paste the SOAP request XSD in the request schema section.
  • Response Payload:
    • Also use application/xml or text/xml.
    • Paste the SOAP response XSD in the response schema section.





3. Import the SOAP Envelope Schema

  • Use SOAPENV.xsd (e.g., the 2006 version shown in the screenshot) to define the outer SOAP structure.
  • Import your business-specific XSD (e.g., VoltageDipIncidentCustomerAccountsMessage.xsd) for the actual payload.
Add an xs:import to include the required XSD:

<xs:import namespace="http://iec.ch/TC57/2011/VoltageDipIncidentCustomerAccountsMessage"
 schemaLocation="VoltageDipIncidentCustomerAccountsMessage.xsd"/>


Add the request and response elements from the imported XSD:

<xs:element name="Body" type="tns:Body"/>
<xs:complexType name="Body">
    <xs:sequence>
        <xs:element ref="ns1:VoltageDipIncidentCustomerAccountsResponseMessage"/>
        <xs:element ref="ns1:CreatedVoltageDipIncidentCustomerAccounts"/>
    </xs:sequence>
    <xs:anyAttribute namespace="##any" processContents="lax">
        <xs:annotation>
            <xs:documentation>
                Prose in the spec does not specify that attributes are allowed on the Body element.
            </xs:documentation>
        </xs:annotation>
    </xs:anyAttribute>
</xs:complexType>



SOAP 1.1 Specification

4. Map the Incoming SOAP Request

  • Use OIC’s mapper to extract the SOAP Body content into integration variables.
  • Process or transform as required.

5. Prepare the SOAP Response

  • Map your processed data back into the SOAP Response structure.
  • Ensure proper namespace handling (as per the SOAP schema).

6. Test the REST Endpoint

  • Use Postman or SOAP UI:
    • Send a POST request with the full SOAP XML as the body.
    • Set the Content-Type header to text/xml.
  • Verify that the response is a valid SOAP envelope.
Tested XML payload:
<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/" 
            xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
            xmlns:ns1="http://iec.ch/TC57/2011/VoltageDipIncidentCustomerAccountsMessage">

    <s:Body>
        <ns1:CreatedVoltageDipIncidentCustomerAccounts>

            <!-- Header Section -->
            <Header>
                <Verb xmlns="http://iec.ch/TC57/2011/schema/message">created</Verb>
                <Noun xmlns="http://iec.ch/TC57/2011/schema/message">VoltageDipIncidentCustomerAccounts</Noun>
                <Revision xmlns="http://iec.ch/TC57/2011/schema/message">2.0</Revision>
                <Timestamp xmlns="http://iec.ch/TC57/2011/schema/message">2024-08-05T09:59:30.3759213+08:00</Timestamp>
                <Source xmlns="http://iec.ch/TC57/2011/schema/message">ABC</Source>
                <MessageID xmlns="http://iec.ch/TC57/2011/schema/message">638584487073759213</MessageID>
                <CorrelationID xmlns="http://iec.ch/TC57/2011/schema/message">638584487073759213</CorrelationID>
            </Header>

            <!-- Payload Section -->
            <Payload>
                <VoltageDipIncidentCustomerAccounts xmlns="http://iec.ch/TC57/2007/VoltageDipIncidentCustomerAccounts#">

                    <!-- Incident Record 1 -->
                    <IncidentRecord>
                        <mRID>INC1233038290</mRID>
                        <createdDateTime>2024-08-05T09:52:21+08:00</createdDateTime>
                        <CustomerAccounts>
                            <mRID>32812156411</mRID>
                        </CustomerAccounts>
                        <CustomerAccounts>
                            <mRID>32812156412</mRID>
                        </CustomerAccounts>
                    </IncidentRecord>

                    <!-- Incident Record 2 -->
                    <IncidentRecord>
                        <mRID>INC1233038291</mRID>
                        <createdDateTime>2024-08-05T08:32:25+08:00</createdDateTime>
                    </IncidentRecord>

                    <!-- Incident Record 3 -->
                    <IncidentRecord>
                        <mRID>INC1233038292</mRID>
                        <createdDateTime>2024-08-05T07:35:21+08:00</createdDateTime>
                        <CustomerAccounts>
                            <mRID>32812156412</mRID>
                        </CustomerAccounts>
                    </IncidentRecord>
               </VoltageDipIncidentCustomerAccounts>
            </Payload>
     </ns1:CreatedVoltageDipIncidentCustomerAccounts>
    </s:Body>
</s:Envelope>
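For illustration only, the Body extraction that the mapper performs in step 4 can be mimicked in Python with the standard library. The namespaces match the sample payload above; this is a sketch, not the OIC mapping itself:

```python
import xml.etree.ElementTree as ET

P = "http://iec.ch/TC57/2007/VoltageDipIncidentCustomerAccounts#"
NS = {"s": "http://schemas.xmlsoap.org/soap/envelope/", "p": P}

def extract_incident_ids(soap_xml: str) -> list:
    """Pull each IncidentRecord mRID out of the SOAP Body."""
    root = ET.fromstring(soap_xml)
    body = root.find("s:Body", NS)
    # Iterate every IncidentRecord under the payload, regardless of nesting
    return [rec.findtext("p:mRID", namespaces=NS)
            for rec in body.iter("{%s}IncidentRecord" % P)]
```

A caller would POST the full envelope to the /soapxmlhandler endpoint with Content-Type: text/xml, exactly as in the Postman test above.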

7. Deployment

  • Activate the integration in OIC.
  • Share the REST endpoint URL with the consuming SOAP system.


Monday, August 4, 2025

OIC - How to Rename a File in Oracle Object Storage using Oracle Integration (OIC)

Use Case

In Oracle Cloud, Object Storage is often used as a staging area for ERP file processing, such as GL Extract, HCM Extract, or bulk data loads. However, once a file is processed successfully in ERP, it’s best practice to rename the file to avoid reprocessing or for better traceability.

For example, after a GL Extract File is loaded into ERP successfully, we want to rename it by adding a prefix such as Processed_ or appending a timestamp. This avoids confusion and maintains clear file lifecycle management.


Solution Steps

1. Object Storage REST API – renameObject

Oracle Object Storage provides a renameObject action that allows you to rename an object within a bucket without re-uploading it.

API Endpoint Format:

POST /n/{namespaceName}/b/{bucketName}/actions/renameObject

Oracle Docs:
renameObject API – Oracle Cloud Infrastructure


2. Sample Request JSON

{
  "sourceName": "SourceObjectName",
  "newName": "TargetObjectName",
  "srcObjIfMatchETag": "*",
  "newObjIfMatchETag": "*",
  "newObjIfNoneMatchETag": "*"
}
  • sourceName → Current file name in the bucket.
  • newName → New file name after rename.
  • * in ETag fields ensures no version conflicts during rename.
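A minimal sketch of assembling this request in Python. The Processed_ prefix and the templated path are assumptions for illustration; in OIC the same concatenation is done in the mapper with concat():

```python
import json

def rename_request(source_name: str, prefix: str = "Processed_"):
    """Build the renameObject relative path and JSON body for a file rename."""
    body = {
        "sourceName": source_name,
        "newName": prefix + source_name,   # mirrors concat(prefix, name)
        # "*" in the ETag fields avoids version-conflict checks on rename
        "srcObjIfMatchETag": "*",
        "newObjIfMatchETag": "*",
        "newObjIfNoneMatchETag": "*",
    }
    path = "/n/{namespaceName}/b/{bucketName}/actions/renameObject"
    return path, json.dumps(body)
```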

3. Implementation in Oracle Integration (OIC)

Step 3.1 – Configure REST Invoke

  • Name: RenameGLExtractFile
  • Method: POST
  • Relative URI:
    /n/{namespaceName}/b/{bucketName}/actions/renameObject
    
  • Enable Add and review parameters & Configure request payload options.

Step 3.2 – Create Request Mapping

From the OIC mapping canvas:

  • Map sourceName to the original filename variable (e.g., glExtFileName).
  • Map newName to the expression that generates the updated filename:
    concat(Var_PrefixTag_AddToFilename_FileUploaded, name)
    
  • Set srcObjIfMatchETag, newObjIfMatchETag, and newObjIfNoneMatchETag to "*".

Step 3.3 – Pass Template Parameters

  • bucketName → OIC variable holding the target bucket name.
  • namespaceName → OIC variable holding Object Storage namespace.

Step 3.4 – Test the Flow

Once the file is loaded successfully in ERP:

  1. Invoke the renameObject API via your configured REST connection.
  2. Verify in OCI Console → Object Storage that the file appears with the new name.

Example Scenario

  • Before Rename: GLExtract_20250804.txt
  • After Rename: Processed_GLExtract_20250804.txt

Screenshots:

Tuesday, July 29, 2025

OIC - Handling Mixed File Inputs in Oracle Integration (OIC): Smart Zip Check and Transfer to SFTP

Use Case

In real-world integrations, especially with ERP or external systems, files received from source applications may vary in format — some may already be zipped while others may not. To ensure consistency and avoid errors during downstream processing or transfer to systems like SFTP, we need a way to check if a file is zipped and handle it accordingly.

This use case demonstrates an Oracle Integration Cloud (OIC) integration that intelligently detects whether an incoming file is a zipped archive. If the file is already zipped, it is forwarded directly to the target SFTP server. If it's not zipped, the integration compresses it and then transfers it to the target location.


Solution Steps

  1. Scope: CheckZipFile

    • A scope that encapsulates the logic for file evaluation and error handling.
  2. Main Flow:

    • Stage File – UnzipFile:
      Tries to unzip the incoming file.
      • If successful, it means the file was zipped. No further compression is needed.
      • If it fails (i.e., the file isn't a zip), an exception is thrown.
  3. Fault Handler: Default Handler

    • Stage File – ZipFile:
      This step is triggered only when UnzipFile fails, meaning the file wasn't zipped. The step compresses the incoming file.
    • Stitch – AssignFileRef:
      Assigns or updates the file reference to point to the newly zipped version for further processing or transfer.
  4. Downstream Processing 

    • The processed (either original zipped or newly zipped) file is sent to the target SFTP or other endpoints.
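The same check-then-zip decision can be sketched in Python with the standard library. Here is_zipfile plays the role of the try-unzip probe, and the compression branch corresponds to the fault handler's ZipFile step (file paths are placeholders):

```python
import zipfile
from pathlib import Path

def ensure_zipped(path: str) -> str:
    """Return the path of a zipped version of the file, compressing only if needed."""
    p = Path(path)
    if zipfile.is_zipfile(p):
        return str(p)                       # already a zip: forward as-is
    zipped = Path(str(p) + ".zip")          # not a zip: compress it first
    with zipfile.ZipFile(zipped, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(p, arcname=p.name)
    return str(zipped)                      # reassign the file reference
```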

Benefits

  • Flexibility: Handles both zipped and non-zipped files without requiring format enforcement at the source.
  • Error Handling: Robust fallback logic ensures no failure in case of unexpected file formats.
  • Automation Ready: Ideal for file-based B2B integrations or scheduled ERP exports.


Monday, July 28, 2025

OIC - Smart File Polling from OCI Object Storage with Dynamic Day-Based Logic and Datadog Logging

Use Case

A business requirement involves polling files from Oracle Cloud Infrastructure (OCI) Object Storage, transferring them to a target file system via the connectivity agent, and logging any missing files in Datadog. The expected file count varies by day of the week:

  • Friday: Expect 3 files
  • Saturday: Expect 1 file
  • Other days: Can be triggered manually via an adhoc flag

The solution ensures resilience through structured error handling and JSON-driven logic that categorizes files into MissingFiles and ProcessFiles.

Flow diagram


Solution Architecture Overview

The solution is designed using 3 OIC integrations and a supporting JSON structure:

{
  "MissingFiles": [
    {
      "FileName": ""
    }
  ],
  "ProcessFiles": [
    {
      "FileName": ""
    }
  ]
}

🔁 Integration 1: Scheduler Integration

  • Purpose: Triggers the flow based on the scheduled time or ad-hoc execution.
  • Steps:
    • Runs on a schedule (typically daily).
    • Accepts a flag adhocExecution = Y/N to override weekday logic.
    • Calls the Main Integration.

🔧 Integration 2: Main File Polling Integration

  • Purpose: List and categorize files from OCI Object Storage.
  • Steps:
    1. List all files from a configured object storage bucket.
    2. Determine required file count based on:
      • Day of the week (Friday = 3, Saturday = 1).
      • adhocExecution = Y allows polling on other days.
    3. Compare expected vs actual files.
    4. Populate a JSON object:
      • ProcessFiles: Files found and ready to process.
      • MissingFiles: Files not found (expected but missing).
    5. For each ProcessFile, invoke the Child Integration.
    6. Log MissingFiles to Datadog using REST API/log collector.

Missing files – throw fault



XSLT code for categorizing files into ProcessFiles vs. MissingFiles:
<xsl:template match="/xml:id_11">
  <ns1grp:write xml:id="id_117">
    <ns31:request-wrapper>
      <xsl:if test="($Var_dayOfWeek = &quot;Friday&quot;)">
        <xsl:if test="not($FileRef_Var/nsmpr0:response-wrapper/nsmpr0:objects[nsmpr0:name='BP_IDENTITY.csv'])">
          <ns31:MissingFiles>
            <ns31:FileName>
              <xsl:value-of select="'BP_IDENTITY.csv'"/>
            </ns31:FileName>
          </ns31:MissingFiles>
        </xsl:if>
        <xsl:if test="not($FileRef_Var/nsmpr0:response-wrapper/nsmpr0:objects[nsmpr0:name='BP_CREDITCARD.csv'])">
          <ns31:MissingFiles>
            <ns31:FileName>
              <xsl:value-of select="'BP_CREDITCARD.csv'"/>
            </ns31:FileName>
          </ns31:MissingFiles>
        </xsl:if>
        <xsl:if test="not($FileRef_Var/nsmpr0:response-wrapper/nsmpr0:objects[nsmpr0:name='DLP_CA_NUM.csv'])">
          <ns31:MissingFiles>
            <ns31:FileName>
              <xsl:value-of select="'DLP_CA_NUM.csv'"/>
            </ns31:FileName>
          </ns31:MissingFiles>
        </xsl:if>
      </xsl:if>
      <xsl:if test="($Var_dayOfWeek = &quot;Saturday&quot;)">
        <xsl:if test="not($FileRef_Var/nsmpr0:response-wrapper/nsmpr0:objects[nsmpr0:name='BP_NAMES.csv'])">
          <ns31:MissingFiles>
            <ns31:FileName>
              <xsl:value-of select="'BP_NAMES.csv'"/>
            </ns31:FileName>
          </ns31:MissingFiles>
        </xsl:if>
      </xsl:if>
      <xsl:if test="($Var_dayOfWeek = &quot;Friday&quot;) or (/nsmpr0:execute/ns17:request-wrapper/ns17:ProcessRequest/ns17:AdhocExecutionFlag = &quot;Y&quot;)">
        <xsl:if test="$FileRef_Var/nsmpr0:response-wrapper/nsmpr0:objects[nsmpr0:name='BP_IDENTITY.csv']">
          <ns31:ProcessFiles>
            <ns31:FileName>
              <xsl:value-of select="'BP_IDENTITY.csv'"/>
            </ns31:FileName>
          </ns31:ProcessFiles>
        </xsl:if>
        <xsl:if test="$FileRef_Var/nsmpr0:response-wrapper/nsmpr0:objects[nsmpr0:name='BP_CREDITCARD.csv']">
          <ns31:ProcessFiles>
            <ns31:FileName>
              <xsl:value-of select="'BP_CREDITCARD.csv'"/>
            </ns31:FileName>
          </ns31:ProcessFiles>
        </xsl:if>
        <xsl:if test="$FileRef_Var/nsmpr0:response-wrapper/nsmpr0:objects[nsmpr0:name='DLP_CA_NUM.csv']">
          <ns31:ProcessFiles>
            <ns31:FileName>
              <xsl:value-of select="'DLP_CA_NUM.csv'"/>
            </ns31:FileName>
          </ns31:ProcessFiles>
        </xsl:if>
      </xsl:if>
      <xsl:if test="($Var_dayOfWeek = &quot;Saturday&quot;) or (/nsmpr0:execute/ns17:request-wrapper/ns17:ProcessRequest/ns17:AdhocExecutionFlag = &quot;Y&quot;)">
        <xsl:if test="$FileRef_Var/nsmpr0:response-wrapper/nsmpr0:objects[nsmpr0:name='BP_NAMES.csv']">
          <ns31:ProcessFiles>
            <ns31:FileName>
              <xsl:value-of select="'BP_NAMES.csv'"/>
            </ns31:FileName>
          </ns31:ProcessFiles>
        </xsl:if>
      </xsl:if>
    </ns31:request-wrapper>
  </ns1grp:write>
</xsl:template>
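For readers more comfortable with code than XSLT, the same categorization can be sketched in Python. This is a simplified illustration: the ad-hoc flag here treats every expected file as processable, approximating the or-conditions in the XSLT.

```python
# Expected files per day of week, matching the XSLT conditions
EXPECTED = {
    "Friday": ["BP_IDENTITY.csv", "BP_CREDITCARD.csv", "DLP_CA_NUM.csv"],
    "Saturday": ["BP_NAMES.csv"],
}

def categorize(day: str, found: set, adhoc: bool = False) -> dict:
    """Split expected files into ProcessFiles / MissingFiles (the JSON structure)."""
    expected = EXPECTED.get(day, [])
    if adhoc:
        # Ad-hoc execution: consider every configured file on any day
        expected = [f for names in EXPECTED.values() for f in names]
    return {
        "ProcessFiles": [{"FileName": f} for f in expected if f in found],
        "MissingFiles": [{"FileName": f} for f in expected if f not in found],
    }
```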

Integration 3: Child File Processor

  • Purpose: Handles individual file transfer and cleanup.
  • Steps:
    • Download file from OCI Object Storage.
    • Write the file to a local file system via Agent.
    • Delete the file from OCI Object Storage post-processing.


Key Highlights

  • Dynamic logic using weekday and ad-hoc flags.
  • Robust processing pipeline using JSON mapping and loop controls.
  • Clean-up mechanism ensures files aren't reprocessed.
  • Monitoring integration using Datadog for transparency and alerting.


Friday, July 25, 2025

OIC - Streamlined OIC Scheduled Integration for GL Fuel Cost Data Using ESS Job Callback and Event-Based File Processing Without Timeouts

Use Case

The organization needs to automate the extraction and delivery of GL Fuel Cost data generated from a BI Publisher report in Oracle Fusion ERP. The integration should be efficient, scalable, and must avoid long-running processes and timeouts within Oracle Integration Cloud (OIC).

The requirement includes:

  • Scheduled trigger to initiate the process.
  • Submission of the BI report using the Export ESS Job Wrapper.
  • Callback-based integration to handle ESS job completion.
  • Document download, decode, unzip, and CSV parsing.
  • Transformation and delivery of the processed data to a target system.

Optimized Solution Approach (Without Timeout)

1. Scheduled Integration – Submit ESS Job Only

  • Create a Scheduled Orchestration Integration in OIC.
  • Trigger the Export ESS Job Wrapper using the ERP Cloud Adapter.
    • Service: ErpIntegrationService
    • Operation: exportBulkData
  • Pass the following information:
    • JobName: "JobPackageName,JobDefinitionName"
    • parameterList: concat(param1,param2)
    • JobOptions: EnableEvent=Y to ensure the callback event is published.
  • Do not wait for completion in the same flow — this avoids long runtime and timeouts.





2. Callback Integration – Listen to ERP Event

  • Create a separate App-Driven Integration.
  • Use ERPIntegrationOutboundEvent as the trigger.
  • Apply a filter condition on Job Name or Report Path to restrict to only GL Fuel Cost report completions.

3. Get Report Output via Document ID

  • From the event payload, extract the Document ID.
  • Call the GetDocumentForDocumentId operation using ERP Cloud Adapter.
  • Retrieve the output as a base64-encoded ZIP.




4. Decode and Unzip File

  • Use OIC's base64 decode function.
  • Use Stage File – Unzip File action to extract files from the archive.
  • Use a For-Each loop to iterate through unzipped files.

5. Parse CSV and Transform Data

  • For files with .csv extension:
    • Use Stage File – Read File to parse contents.
    • Transform the data into the required target structure using Mapper.
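Steps 4 and 5 together can be sketched in Python with the standard library. This is an illustration of the decode → unzip → parse sequence, not the OIC Stage File actions themselves:

```python
import base64
import csv
import io
import zipfile

def parse_report_zip(b64_zip: str) -> dict:
    """Decode a base64-encoded ZIP (as returned by GetDocumentForDocumentId)
    and parse every .csv member into a list of row dictionaries."""
    raw = base64.b64decode(b64_zip)          # step 4: base64 decode
    parsed = {}
    with zipfile.ZipFile(io.BytesIO(raw)) as zf:   # step 4: unzip
        for name in zf.namelist():                 # for-each unzipped file
            if name.lower().endswith(".csv"):      # step 5: CSV files only
                text = zf.read(name).decode("utf-8")
                parsed[name] = list(csv.DictReader(io.StringIO(text)))
    return parsed
```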

6. Deliver Data to Target System

  • Send the formatted data to the external target (via REST, FTP, etc.).
  • Include proper error handling, logging, and retry logic for robustness.

Benefits of This Approach

  • Avoids Timeout: Processing is split between scheduler and callback flow, preventing long execution.
  • Event-Driven: Ensures OIC listens and responds only when the ESS job completes successfully.
  • Modular Design: Easier to maintain and enhance.
  • Optimized Performance: Resource-light scheduled flow, heavier processing is deferred to callback.

ESS job sample page:


