Wednesday, September 17, 2025

OIC - OIC Utility to Reprocess Failed Real-Time Integration JSON Payloads

📌 Use Case

In real-time OIC integrations, JSON payloads are exchanged with external systems via REST APIs. When such integrations fail (due to downstream errors, connectivity issues, or invalid data), the input payload is saved into an OIC SFTP error folder for recovery.

Manually retrieving, reviewing, and resubmitting these payloads is time-consuming and error-prone.

To simplify recovery, we build an OIC Utility Service that:

  • Takes Interface ID and Payload File Name as inputs
  • Fetches error folder path and target REST endpoint (relative path) from a Lookup
  • Reads the failed payload file from the error folder
  • Encodes and decodes the payload as Base64
  • Calls the dynamic downstream REST service
  • Resubmits the payload as binary JSON for seamless reprocessing

This ensures that failed real-time integrations can be reprocessed quickly and reliably, without manual intervention.


🛠️ Solution Steps

1. Create a Lookup for Metadata

Define a Lookup in OIC with mappings for each interface:

  • Interface ID → Error Folder Path → Relative REST Endpoint Path
    Example:
HCM_IFC    | /u01/error/hcm/json    | /hcm/v1/worker
Payroll_IFC| /u01/error/payroll/json| /payroll/v2/run
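Conceptually, the Lookup behaves like a simple key-to-metadata map. A minimal JavaScript sketch of that idea (the interface IDs and paths are the example values above; the in-memory object is only a stand-in for the real OIC Lookup):

```javascript
// Hypothetical in-memory stand-in for the OIC Lookup described above.
var interfaceLookup = {
  "HCM_IFC":     { errorFolder: "/u01/error/hcm/json",     endpoint: "/hcm/v1/worker" },
  "Payroll_IFC": { errorFolder: "/u01/error/payroll/json", endpoint: "/payroll/v2/run" }
};

// Resolve metadata for an interface, failing fast on unknown IDs.
function resolveInterface(interfaceId) {
  var meta = interfaceLookup[interfaceId];
  if (!meta) {
    throw new Error("No lookup entry for interface: " + interfaceId);
  }
  return meta;
}

console.log(resolveInterface("HCM_IFC").endpoint); // /hcm/v1/worker
```

In the real integration, the same resolution is done with the OIC lookup functions inside the mapper.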

2. Design the Utility App-Driven Orchestration

Trigger the utility with a REST endpoint that accepts:

  • Interface ID
  • Payload File Name


3. Fetch Error Path and REST Endpoint from Lookup

  • Use Lookup functions to dynamically retrieve:
    • Error folder path
    • Relative endpoint URI

4. List Files in Error Folder

  • Use File Server (SFTP) action to list and fetch the payload file based on the provided name.
  • Capture the fileReference from the file action (pointer to the file inside OIC).

5. Read and Encode Payload

  • Use Read File (File Server Action) → Get the file content using fileReference.
  • Encode the payload into Base64, then decode back to binary JSON inside OIC.
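The Base64 encode/decode round trip described above can be illustrated in plain JavaScript (Node's Buffer stands in here for OIC's built-in Base64 handling, and the payload content is made up):

```javascript
// Demonstration of the Base64 round trip; OIC uses its own built-in
// Base64 handling, this only shows the principle.
var payload = JSON.stringify({ workerId: 101, action: "HIRE" }); // sample failed payload

// Encode the JSON text to Base64 for safe transport inside the integration...
var encoded = Buffer.from(payload, "utf8").toString("base64");

// ...then decode it back to the original JSON before resubmission.
var decoded = Buffer.from(encoded, "base64").toString("utf8");

console.log(decoded === payload); // true
```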

6. Call Dynamic REST Service

  • Use HTTP Adapter with dynamic configuration:
    • Base URL → from OIC Connection
    • Relative Path → from Lookup
  • Pass the decoded JSON payload as the request body.





7. Handle Logging & Tracking

  • Log success/failure at each step (File found, Payload resubmitted, REST service status).
  • Update monitoring dashboard or custom tables for auditing.

Benefits

  • Automated reprocessing of failed JSON payloads
  • Dynamic & reusable across multiple interfaces via Lookup
  • Reduces manual errors in resubmission
  • Improves system reliability & recovery time for real-time integrations


Tuesday, September 16, 2025

OIC - Building a Utility Service in OIC to Reprocess Failed Files

📌 Use Case

In many OIC (Oracle Integration Cloud) projects, integrations involve file-based processing where files are picked from an inbound location and processed further.

However, some files may fail due to validation issues, network errors, or downstream service unavailability. Typically, failed files are moved into an error folder.

Instead of manually moving and reprocessing files, we can create a reusable utility App-Driven Orchestration in OIC that:

  • Takes input parameters: Interface ID, File Name, and File Processing Date
  • Identifies inbound and error folders from a lookup using the Interface ID
  • Lists all failed files in the error folder
  • Moves files back to the inbound folder
  • Calls the next integration service dynamically (via absolute endpoint URI)
  • Passes the required parameters to retry the file processing automatically

This makes reprocessing automated, consistent, and faster.


🛠️ Solution Steps

1. Create Lookup for Folder Paths

  • Define a Lookup in OIC with mappings:
    • Interface ID → Inbound Folder Path → Error Folder Path
  • Example:
    Payroll_IFC | /u01/inbound/payroll | /u01/error/payroll
    HCM_IFC     | /u01/inbound/hcm     | /u01/error/hcm
    

2. Design the App-Driven Orchestration (Utility Service)

  • Triggered by a REST endpoint that takes:
    • Interface ID
    • File Name
    • File Processing Date


3. Fetch Folder Paths from Lookup

  • Use OIC Lookup functions to fetch Error Folder and Inbound Folder based on the provided Interface ID.

4. List Files in Error Folder

  • Call FTP / File Adapter to list files from the error folder.
  • Apply filter by File Name + Date if provided.
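The name/date filter applied to the listing can be sketched as follows (the file-list shape here is hypothetical; the real FTP list operation returns its own structure):

```javascript
// Hypothetical listing result; the FTP adapter's actual response differs.
var files = [
  { name: "payroll_20250915.csv", modified: "2025-09-15" },
  { name: "payroll_20250916.csv", modified: "2025-09-16" },
  { name: "hcm_20250916.csv",     modified: "2025-09-16" }
];

// Keep only files matching the requested name and (optionally) processing date.
function filterFailedFiles(files, fileName, processingDate) {
  return files.filter(function (f) {
    if (fileName && f.name !== fileName) return false;
    if (processingDate && f.modified !== processingDate) return false;
    return true;
  });
}

console.log(filterFailedFiles(files, "payroll_20250916.csv", "2025-09-16").length); // 1
```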

5. Move Files from Error → Inbound

  • For each matching file:
    • Use File Adapter (Read/Write) or FTP Move operation
    • Move file from the error folder to the inbound folder

6. Call the Next Integration

  • Configure an HTTP Adapter to call the downstream OIC service (absolute endpoint).
  • Pass parameters like:
    • File Name
    • Processing Date
    • Interface ID
  • This re-triggers the main integration flow as if the file was newly dropped in inbound.


7. Handle Errors and Logging

  • Add tracking:
    • Success → File reprocessed successfully
    • Failure → Log reason (e.g., file not found, service unavailable)
  • Store logs in OIC Activity Stream or custom log file.

✅ Benefits

  • Fully automated reprocessing of failed files
  • No manual intervention needed
  • Reusable utility → works across multiple integrations
  • Lookup-driven → easy to extend for new interfaces


OIC - Create a large test file by exponential doubling using Windows batch or command

 Use case

You need a large text file quickly for testing integrations, stress-testing parsers, load tests, or demoing file-handling logic. The simplest approach on Windows is to start with a single line and repeatedly concatenate the file with itself — this doubles the line count each pass and reaches large sizes fast (exponential growth).


Explanation:

This batch script creates an output file with a single sample line and then doubles its size repeatedly by concatenating the file into a temporary file and back. Because every iteration multiplies the line count by two, you quickly reach tens or hundreds of thousands of lines with only a small number of iterations. The example below runs 17 doubling passes, producing 2^17 = 131,072 lines.


Code (copy-paste into a .bat file or .cmd file)

@echo off
setlocal EnableDelayedExpansion

:: Line content
set "line=This is the sample line"

:: Output file (adjust path if needed)
set "outfile=%USERPROFILE%\Desktop\output.txt"
if exist "%outfile%" del "%outfile%"

:: Start with 1 line
echo %line%>"%outfile%"

:: Double the file until it reaches ~100,000+ lines
for %%i in (1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17) do (
  type "%outfile%" >> "%outfile%.tmp"
  type "%outfile%.tmp" >> "%outfile%"
  del "%outfile%.tmp"
)

echo Done!
pause


Step-by-step solution / how it works

  1. Set up environment

    • @echo off hides command echoing.
    • setlocal EnableDelayedExpansion enables advanced variable handling (safe practice for scripts that modify variables in loops).
  2. Define the sample line

    • set "line=This is the sample line" — change the text inside quotes to whatever content you want repeated.
  3. Define and clear output file

    • set "outfile=%USERPROFILE%\Desktop\output.txt" places file on Desktop; change path as required.
    • if exist "%outfile%" del "%outfile%" removes any previous file with the same name.
  4. Seed file

    • echo %line%>"%outfile%" creates the file with a single line to start.
  5. Double the file repeatedly

    • The for %%i in (...) do (...) loop runs 17 times (you can change the number to control final size).
    • Inside loop:
      • type "%outfile%" >> "%outfile%.tmp" writes the current file content into a temp file (one copy).
      • type "%outfile%.tmp" >> "%outfile%" appends that temp file back to the original — now the original contains the old content plus the appended copy → doubled size.
      • del "%outfile%.tmp" removes the temp file.
  6. Finish

    • echo Done! informs completion and pause keeps the console open to view the message (press any key to exit).

How to choose number of iterations

  • Start with 1 line. Each iteration doubles the line count: after n iterations you have 2^n lines.
    • 10 iterations → 1,024 lines
    • 16 iterations → 65,536 lines
    • 17 iterations → 131,072 lines (the sample script)
  • If you want a specific line count, choose n = ceil(log2(desired_lines)).
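The iteration count follows directly from the doubling rule; a small JavaScript helper (illustrative only, not part of the batch script):

```javascript
// n = ceil(log2(desiredLines)) doubling passes reach at least desiredLines,
// starting from a single seed line.
function iterationsFor(desiredLines) {
  return Math.ceil(Math.log2(desiredLines));
}

console.log(iterationsFor(1024));   // 10
console.log(iterationsFor(100000)); // 17  (2^17 = 131072 >= 100000)
```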

Tips & cautions

  • Disk space & memory: big files can consume significant disk space and may be slow on slow disks. Use caution on low-storage systems.
  • Encoding: echo/type produce ANSI encoding by default. If you need UTF-8, consider using PowerShell instead, or ensure that consumers of the file can handle ANSI.
  • Permissions: run in a location where you have write access (Desktop is safe for user-run scripts).
  • Performance: doubling is fast but each type reads and writes the whole file — for extremely large sizes, consider streaming approaches or generating lines programmatically.
  • Cleanup: remember to delete the generated file after tests.

Thursday, September 11, 2025

OIC - Generating SAS Token for Azure Hub Access in OIC Using Built-in Functions, not using crypto library

📌 Use Case

When integrating Oracle Integration Cloud (OIC) with Azure Event Hub / Service Bus / IoT Hub, authentication requires a Shared Access Signature (SAS) token.

  • This token is generated from:
    • Resource URI (sr)
    • Expiry time (se)
    • Shared Access Key Name (skn)
    • Shared Access Key (saKey)
  • The signature (sig) must be an HMAC-SHA256 hash of the resource URI and expiry, encoded in Base64 and URL-safe.

Instead of relying on external crypto libraries, we can leverage OIC’s built-in oic.crypto.hmacsha256 function to securely generate this SAS token inside integration code.


🛠 Solution Steps

1. Define Hex → Base64 URL-safe Converter

The Azure signature must be Base64 URL-encoded. In OIC JS functions, the HMAC result is hex, so we first convert it:

function hexToBase64UrlEncoded(hexString) {
  // Convert hex to byte array
  var bytes = [];
  for (var i = 0; i < hexString.length; i += 2) {
    bytes.push(parseInt(hexString.substr(i, 2), 16));
  }

  // Base64 character set
  var base64Chars = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/';
  var base64 = '';

  // Process every 3 bytes into 4 base64 characters
  for (var i = 0; i < bytes.length; i += 3) {
    var byte1 = bytes[i];
    var byte2 = i + 1 < bytes.length ? bytes[i + 1] : 0;
    var byte3 = i + 2 < bytes.length ? bytes[i + 2] : 0;

    var triplet = (byte1 << 16) | (byte2 << 8) | byte3;

    base64 += base64Chars[(triplet >> 18) & 0x3F];
    base64 += base64Chars[(triplet >> 12) & 0x3F];
    base64 += i + 1 < bytes.length ? base64Chars[(triplet >> 6) & 0x3F] : '=';
    base64 += i + 2 < bytes.length ? base64Chars[triplet & 0x3F] : '=';
  }

  // URL-encode the Base64 string
  return encodeURIComponent(base64);
}
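To sanity-check the converter, the same transformation can be reproduced with Node's Buffer (hex to Base64, then URL-encode), which is exactly what the hand-rolled function implements; this reference version is for local verification only:

```javascript
// Reference implementation using Node's Buffer for comparison.
function hexToBase64UrlRef(hexString) {
  var base64 = Buffer.from(hexString, "hex").toString("base64");
  return encodeURIComponent(base64);
}

// "Hello" in hex -> "SGVsbG8=" in Base64 -> "SGVsbG8%3D" after URL-encoding.
console.log(hexToBase64UrlRef("48656c6c6f")); // SGVsbG8%3D
```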

2. Generate SAS Token in OIC Build Function

This function assembles the SAS token using OIC’s built-in crypto support:

function GetAzureHubAccessTokenOIC(uri, saName, saKey) {
  if (!uri || !saName || !saKey) {
    throw new Error("Missing required parameter");
  }

  var encoded = encodeURIComponent(uri);
  var now = new Date();

  // Token validity: 1 week
  var week = 60 * 60 * 24 * 7; // in seconds
  var ttl = Math.round(now.getTime() / 1000) + week;

  // String to sign
  var signature = encoded + '\n' + ttl;

  // HMAC-SHA256 using OIC built-in function
  var hashCode_value = oic.crypto.hmacsha256(signature, saKey);

  // SAS Token format
  var sasToken =
    "SharedAccessSignature sr=" + encoded +
    "&sig=" + hexToBase64UrlEncoded(hashCode_value) +
    "&se=" + ttl +
    "&skn=" + saName;

  return sasToken;
}

3. Output SAS Token

The function returns a SAS token like:

SharedAccessSignature sr=<resource-uri>
&sig=<signature>
&se=<expiry-timestamp>
&skn=<key-name>

Example:

SharedAccessSignature sr=https%3A%2F%2Fmyeventhubs.servicebus.windows.net%2Fsamplehub
&sig=abcdXYZ123%3D
&se=1726221440
&skn=RootManageSharedAccessKey

Key Takeaways

  • No external crypto library is required — OIC’s built-in oic.crypto.hmacsha256 handles signing.
  • hexToBase64UrlEncoded() ensures the signature is in the correct Base64 URL-safe format.
  • The generated SAS token can be directly used in HTTP headers for Azure Event Hub or Service Bus REST APIs.


Tuesday, September 9, 2025

OIC - How to Extend a Map in an Oracle Integration Cloud (OIC) Accelerator Project

Empowering customizations while preserving upgrade compatibility in OIC accelerator integrations


Use Case: When and Why to Extend a Map

Imagine your OIC accelerator integration was originally designed to sync Employee Data between your HR system and a downstream service. The default mapping includes fields like EmployeeID, FirstName, and LastName. Now, your business needs to include Department Code and Work Location in the mapping, which are not in the original integration flow.

To meet this requirement without modifying the original accelerator, and to keep your extension safe across future updates, you should extend the integration through an Extension Group—Oracle’s recommended, upgrade-safe mechanism for customizations.


Solution Steps: How to Do It (Accurately Guided by Oracle's Documentation)

1. Open the Accelerator Project and Select the Integration

  • In the OIC Projects pane, locate your accelerator project (marked with "Accelerator" and "Oracle").
  • Navigate to the integration you wish to modify.
  • From the Actions menu, choose Extend. This initiates extension mode (note: this option is exclusive to accelerator projects).

2. Insert an Extension Group at the Appropriate Point

  • Locate the step in the integration flow where you want the extended mapping to take place (e.g., before or after an existing Map or Invoke action).
  • Click the Add (+) icon at that point and select Extension Group, or choose Extend before / Extend after from the Actions menu inside the relevant action block.

3. Add the Map Action to the Extension Group

  • Inside the newly created Extension Group, click the Add icon or Integration actions menu and choose Map to create an ad-hoc mapping action.
  • The OIC mapper will open; drag and drop your new fields from source (e.g., DepartmentCode, WorkLocation) to the corresponding targets, applying any transformations as needed.

Notes for Extending a Map in OIC

1. Drag and Map Elements to get new namespace prefix
Select the required elements from the source and map them to the corresponding target fields.

2. Custom DVM Mappings to get new namespace prefix
Use a Domain Value Map (DVM) wherever code-to-description or lookup translations are required. Note that custom DVM namespaces can cause issues in the extended map, so verify them after mapping.

3. Verify and add Namespaces and Prefixes to initial xslt map
Check if new namespaces or prefixes are introduced in the extended schema and ensure they are consistent.

4. Update XSLT in Notepad++
Copy the initial map's XSLT code into Notepad++. Update namespace prefixes if needed, then replace the template in the extended map with the complete template code so that all references resolve correctly.

5. Add New Elements
Introduce any additional elements required by the business use case and map them appropriately.

4. Additional Customization (Optional But Powerful)

You can further enhance your extension with other actions, such as:

  • Data Stitch: To merge multiple payloads or variables.
  • For-Each: To process repeated elements.
  • Switch: To implement conditional routing within your flow.
  • Integration Invoke: To call child integrations.
  • Global Variables, Lookups, or JavaScript Libraries: For reusable variables, code translation/lookup logic, or custom script-based logic.

5. Save, Validate, and Activate

  • Once your extended mapping (and any extra actions) are configured, save the integration.
  • Validate the mapping; OIC will surface any missing or invalid connections.
  • Finally, activate the integration to apply your extension.

6. Preserve Your Extensions During Upgrades

One of the biggest advantages of using an Extension Group is that your customizations remain intact during future accelerator upgrades:

  • When a new version of the accelerator becomes available, you can choose to Merge latest extensions during the installation. Oracle will automatically apply your Extension Group customizations to the upgraded version.
  • Alternatively, if you skip automatic merging, you can still manually merge your extensions into the new version later.

Blog Summary: Why Use Extension Groups for Map Customization

  • Upgrade-safe: Your custom mapping stays intact during accelerator updates.
  • Structured customization: Extensions are isolated in their own group, making them easy to manage and modify.
  • Flexible extensibility: Besides maps, you can add loops, lookups, global variables, child integrations, and more.
  • Low-impact: No changes to the original accelerator code, minimizing risk.


Thursday, September 4, 2025

OIC - SAS Token Generation for Azure Event Hub REST API Authorization in OIC

📌 Use Case

When calling Azure Event Hub REST APIs from Oracle Integration Cloud (OIC), authentication requires a Shared Access Signature (SAS) token.

  • SAS token = authorization key generated using HMAC SHA256 hashing.
  • OIC does not provide native functions to generate this.
  • Solution → Create a custom JavaScript library to build SAS token dynamically and inject it into OIC REST calls.

SAS key vs SAS Token:

An SAS Key (Shared Access Signature Key) is a security credential used in Microsoft Azure to grant limited, time-bound access to resources like Event Hubs, Blob Storage, Queues, Service Bus, etc.

🔑 How it works:

  • When you create an Azure resource (like an Event Hub namespace), Azure generates Access Keys for it.
  • These are usually two keys: Primary Key and Secondary Key.
  • Using one of these keys, you (or your code) can generate an SAS Token.
  • The SAS Token contains:
    • Resource URI (what you want to access)
    • Expiry time (when the token becomes invalid)
    • Signature (HMAC-SHA256 signed using the SAS Key)

👉 The SAS Key is the secret you store securely, and from it you generate SAS Tokens that your app or OIC flow uses in the Authorization header.

⚠️ Important:

  • Never expose the SAS Key directly in your apps or clients.
  • Always generate SAS Tokens from it and use those instead.

Solution flow diagram:


⚙️ Solution Steps

1. Build the OIC Custom Library

  • Download CryptoJS v3.1.2 from GitHub.
  • Copy content from:
    • rollups/hmac-sha256.js
    • components/enc-base64.js
  • Append the following function at the bottom:
function createSharedAccessTokenUpd(uri, saName, saKey) {
    if (!uri || !saName || !saKey) {
        throw new Error("Missing required parameter");
    }

    var encoded = encodeURIComponent(uri);
    var now = new Date();
    var week = 60 * 60 * 24 * 7; // 1 week in seconds
    var ttl = Math.round(now.getTime() / 1000) + week;
    var signature = encoded + '\n' + ttl;

    var hash = CryptoJS.HmacSHA256(signature, saKey);
    var hashInBase64 = CryptoJS.enc.Base64.stringify(hash);

    var sasToken = "SharedAccessSignature sr=" + encoded +
                   "&sig=" + encodeURIComponent(hashInBase64) +
                   "&se=" + ttl +
                   "&skn=" + saName;

    return sasToken;
}
  • Save file as OICHmacSHA256.js.
  • Upload it into OIC Libraries.



2. Generate SAS Token inside OIC Mapping

  • In the integration, open your mapping canvas for the REST call.
  • For the Authorization header, call the custom JS function:
createSharedAccessTokenUpd(
  "https://<your-namespace>.servicebus.windows.net",
  "DefaultFullSharedAccessSignature",
  "<your-shared-access-key>"
)

Here, the function dynamically generates the SAS token and places it in the Authorization header.


3. Mapping in the Integration XML

In the integration file, OIC internally translates this mapping into XSLT/XML. Example:

<ns25:StandardHttpHeaders>
  <ns25:Authorization>
    <xsl:value-of select="ora:js: createSharedAccessTokenUpd(
      &quot;https://test-ns.servicebus.windows.net&quot;,
      &quot;DefaultFullSharedAccessSignature&quot;,
      &quot;wjoLBJ...= &quot; )"/>
  </ns25:Authorization>
</ns25:StandardHttpHeaders>

This ensures every REST call to Azure Event Hub uses the correct SAS token dynamically generated at runtime.


Summary

  • Built a custom OIC library with CryptoJS HMAC-SHA256 + Base64.
  • Added SAS token generator function (createSharedAccessTokenUpd).
  • Called the function in OIC mapping → populated the Authorization header.
  • Verified via XSLT/XML that the SAS token gets injected into the REST API call.

This approach ensures secure and reusable Azure Event Hub connectivity from OIC.



Wednesday, September 3, 2025

XSD - Difference Between ref and type in XSD

Using type (local definition)

Here, we define elements directly with their datatypes.

<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="http://example.com/auth"
           xmlns="http://example.com/auth"
           elementFormDefault="qualified">

  <!-- Local elements -->
  <xs:element name="UserName" type="xs:string"/>
  <xs:element name="Password" type="xs:string"/>

</xs:schema>

XML Output (local namespace elements):

<auth:UserName xmlns:auth="http://example.com/auth">usr</auth:UserName>
<auth:Password xmlns:auth="http://example.com/auth">pwd</auth:Password>

Using ref (reusing global elements)

First, define global reusable elements in a common schema.

common.xsd

<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="http://example.com/common"
           xmlns="http://example.com/common"
           elementFormDefault="qualified">

  <!-- Global elements -->
  <xs:element name="MessageId" type="xs:string"/>
  <xs:element name="Timestamp" type="xs:string"/>

</xs:schema>

Now, reference those in the service schema.

service.xsd

<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="http://example.com/service"
           xmlns:svc="http://example.com/service"
           xmlns:common="http://example.com/common"
           elementFormDefault="qualified">

  <!-- Import the common schema -->
  <xs:import namespace="http://example.com/common" schemaLocation="common.xsd"/>

  <!-- Reference global elements -->
  <xs:element ref="common:MessageId"/>
  <xs:element ref="common:Timestamp"/>

</xs:schema>

XML Output (reused common namespace elements):

<common:MessageId xmlns:common="http://example.com/common">12345</common:MessageId>
<common:Timestamp xmlns:common="http://example.com/common">2025-09-03T10:00:00</common:Timestamp>

Combined SOAP Example

Both styles used together in a real request:

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:auth="http://example.com/auth"
                  xmlns:common="http://example.com/common">
   <soapenv:Header/>
   <soapenv:Body>
      <auth:LoginRequest>
         <!-- Local elements (defined with type) -->
         <auth:UserName>usr</auth:UserName>
         <auth:Password>pwd</auth:Password>

         <!-- Global reusable elements (referenced with ref) -->
         <common:MessageId>12345</common:MessageId>
         <common:Timestamp>2025-09-03T10:00:00</common:Timestamp>
      </auth:LoginRequest>
   </soapenv:Body>
</soapenv:Envelope>

✅ Summary

  • type → Local definition, used for service-specific fields (UserName, Password).
  • ref → Reference to global elements, used for shared fields (MessageId, Timestamp).

This way you can mix both:

  • Use type for fields unique to a service.
  • Use ref for fields that must come from a common namespace.



OIC - Handling Multiple Namespaces in SOAP Payloads in Oracle Integration Cloud (OIC)

Use Case

When integrating with SOAP-based APIs in OIC, the payload sometimes requires elements from different namespaces within the same request.
For example, a SOAP login request may contain:

  • MessageId, ReplyAddress, and Timestamp (from a Common namespace)
  • UserName and Password (from an API-specific namespace)

If the WSDL does not define these elements properly, OIC generates an incorrect SOAP request, causing deserialization errors such as:

CASDK-0033: Received a SOAP fault while invoking endpoint target...
The formatter threw an exception while trying to deserialize the message:
'Element 'UserName' from namespace ... is not expected.
Expecting element 'MessageId'.

Problem

The SOAP request requires two different namespaces for different elements:

  • Namespace A → MessageId, ReplyAddress, Timestamp
  • Namespace B → UserName, Password

If the WSDL only defines one namespace, OIC incorrectly generates the payload, mixing up the expected prefixes.


Solution Steps

Step 1: Define a Schema for MessageId

Create a schema (.xsd) for the Common namespace elements:

<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="http://www.mv90xiApi.com/Common/2022/5"
           elementFormDefault="qualified">

  <xs:element name="MessageId" type="xs:string"/>
  <xs:element name="ReplyAddress" type="xs:string"/>
  <xs:element name="Timestamp" type="xs:string"/>

</xs:schema>

Step 2: Import Schema into WSDL

In your WSDL, import the schema so OIC can resolve the namespace correctly:

<xs:import namespace="http://www.mv90xiApi.com/Common/2022/5"/>

Then, instead of defining MessageId inline, reference it:

<xs:element ref="common:MessageId"/>

Step 3: Fix UserName and Password Namespace

Update the WSDL for the AuthRequestMessage.
Originally it might look like this (incorrect namespace usage):

<xs:complexType name="AuthRequestMessage">
  <xs:complexContent mixed="false">
    <xs:extension base="tns:RequestMessage" xmlns="http://www.mv90xiApi.com/Common/2022/5">
      <xs:sequence>
        <xs:element name="UserName" nillable="true" type="xs:string"/>
        <xs:element name="Password" nillable="true" type="xs:string"/>
      </xs:sequence>
    </xs:extension>
  </xs:complexContent>
</xs:complexType>

Change it to explicitly use the API namespace:

<xs:complexType name="AuthRequestMessage">
  <xs:complexContent mixed="false">
    <xs:extension base="tns:RequestMessage" xmlns="http://www.mv90xiApi.com/api/2022/5">
      <xs:sequence>
        <xs:element name="UserName" nillable="true" type="xs:string"/>
        <xs:element name="Password" nillable="true" type="xs:string"/>
      </xs:sequence>
    </xs:extension>
  </xs:complexContent>
</xs:complexType>

Step 4: Validate Final Payload

After modifying the WSDL, OIC generates the correct SOAP request with both namespaces:

<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope"
               xmlns:ns="http://www.mv90xiApi.com/api/2022/5"
               xmlns:ns1="http://www.mv90xiApi.com/Common/2022/5">
  <soap:Header/>
  <soap:Body>
    <ns:Login>
      <ns:message>
        <ns1:MessageId>12345</ns1:MessageId>
        <ns1:ReplyAddress>http://reply.com</ns1:ReplyAddress>
        <ns1:Timestamp>2025-09-03T10:00:00</ns1:Timestamp>
        <ns:UserName>user</ns:UserName>
        <ns:Password>pwd</ns:Password>
      </ns:message>
    </ns:Login>
  </soap:Body>
</soap:Envelope>

Benefits of This Approach

  • Resolves namespace conflicts between common elements and authentication fields.
  • Ensures SOAP payload matches the service contract.
  • Avoids CASDK-0033 deserialization errors in OIC.

👉 Reference: Oracle Docs – Working with SOAP Integrations



Friday, August 22, 2025

OIC - Sending JSON Data as Email Body Content via Twilio SendGrid API in OIC

Use Case Description:

When integrating Oracle Integration Cloud (OIC) with Twilio SendGrid API to send emails, JSON data intended for the email body is sometimes sent as an attachment instead of inline content. This issue occurs because the API interprets the payload as attachment content, especially if the content type isn't correctly specified. The goal is to send JSON data directly within the email body using the string() function and setting the Content-Type as text/plain or application/json.

Solution Steps:
1. Serialize JSON Data to String
Use the string() function in OIC to convert your JSON object into a string format suitable for embedding in the email body.
2. Structure the Email Payload
Prepare the payload adhering to SendGrid's API format: set "content" with "type": "text/plain" or "application/json".
Include the serialized JSON string in the "value" field.
Example payload snippet:

{
  "personalizations": [
    { "to": [{ "email": "recipient@example.com" }], "subject": "JSON Data Email" }
  ],
  "from": { "email": "sender@example.com" },
  "content": [
    { "type": "text/plain", "value": "{your JSON string here}" }
  ]
}

(Use "type": "application/json" instead of "text/plain" if your requirement calls for it.)
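The serialization step can be sketched in JavaScript, where JSON.stringify plays the role OIC's string() function plays in the mapper (the email addresses and business payload are placeholders):

```javascript
// Embed a JSON object as a plain-text string inside the SendGrid mail body.
var data = { orderId: 42, status: "FAILED" }; // sample business payload

var mailPayload = {
  personalizations: [
    { to: [{ email: "recipient@example.com" }], subject: "JSON Data Email" }
  ],
  from: { email: "sender@example.com" },
  content: [
    // Serializing the object keeps it inline in the body instead of
    // being interpreted as attachment content.
    { type: "text/plain", value: JSON.stringify(data) }
  ]
};

console.log(mailPayload.content[0].value); // {"orderId":42,"status":"FAILED"}
```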
3. Setup API Headers
Ensure the HTTP headers include:
Content-Type: application/json
4. Make the API Call in OIC
Use an HTTP action to POST the above payload to SendGrid's API endpoint (https://api.sendgrid.com/v3/mail/send).
Pass the API key in Authorization headers.
5. Validate the Email Content
Check the received email to confirm that JSON data appears inline in the email body, not as an attachment.

Summary:
By serializing JSON data with string(), structuring the payload correctly, and setting the content type appropriately, you can send JSON data directly as the email body through the Twilio SendGrid API in OIC, instead of having it treated as an attachment.

Wednesday, August 13, 2025

OIC - How to Reprocess an HCM Extract in OIC Without Resubmitting the Flow

Use Case

In Oracle HCM integrations, it’s common to schedule extracts and process their output to a target destination like Oracle Object Storage. However, in real-world scenarios, the extract may fail in the middle of the process — for example, due to downstream errors — even though the extract itself completed successfully in HCM.

When this happens, you often want to reprocess the existing extract output rather than re-running the extract flow in HCM (which could cause data duplication or require additional system resources).

To handle this, we design an OIC-based solution with:

  • A Scheduler Integration to initiate the process.
  • A Main Integration to execute or reprocess the extract depending on parameters.

Solution Approach

We will create two integrations:

1. Scheduler Integration

  • Purpose: Accepts runtime parameters and decides whether to submit a new extract or reprocess an existing one.
  • Parameters:
    • EffectiveDate – Extract run date (YYYY-MM-DD format)
    • ExtractFlowInstanceName – Name of the extract flow instance to reprocess
    • SkipExtractSubmission – Yes/No flag to skip submitting the extract and instead retrieve an existing output
  • Logic:
    • If SkipExtractSubmission = No → Call HCM submit extract API, wait for completion, download the file.
    • If SkipExtractSubmission = Yes → Skip submit step, directly get extract instance details, retrieve document ID, and download from UCM.
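The scheduler's branching reduces to a single flag check; a sketch (the step names are illustrative labels, not OIC API calls):

```javascript
// Decide which path the scheduler should take for a run.
function planRun(skipExtractSubmission) {
  if (skipExtractSubmission === "Yes") {
    // Reuse the existing extract output: fetch instance details,
    // get the document ID, and download from UCM.
    return ["getExtractInstanceDetails", "getDocumentId", "downloadFromUCM"];
  }
  // Submit a fresh extract, wait for completion, then download.
  return ["submitExtract", "pollUntilComplete", "downloadFile"];
}

console.log(planRun("Yes")[0]); // getExtractInstanceDetails
console.log(planRun("No")[0]);  // submitExtract
```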



2. Main Integration

  • Purpose: Handles the extract execution, monitoring, file retrieval, and delivery to Object Storage.
  • Key Steps:
    1. Assign & Initialize Variables – Store parameters.
    2. Switch Condition – Decide if extract needs submission or reprocessing.
    3. While Loop – Poll HCM extract status until completion.
    4. Get Document ID – Retrieve from extract instance data.
    5. Download from UCM – Fetch the output file.
    6. Transform Data – Apply required mapping/format changes.
    7. Upload to Object Storage – Store file in the designated bucket.
    8. Error Handling – Throw faults if extract fails or file retrieval fails.

High-Level Flow Diagram



Benefits

  • No Duplicate Data – Avoid re-running the same extract unnecessarily.
  • Faster Recovery – Quickly reprocess failed integrations.
  • Parameter Driven – Flexible execution controlled at runtime.
  • Error Handling Built-In – Ensures issues are caught and handled.

For details on how to download an HCM extract, see my previous blog:

https://soalicious.blogspot.com/2024/08/oic-hcm-how-to-schedule-and-download.html?m=1

Tuesday, August 12, 2025

OIC - Handling SOAP XML in a REST Trigger with Oracle Integration (OIC)

How to accept and respond with SOAP XML payloads in a REST API


Use Case

Many legacy systems still use SOAP-based XML messages for data exchange, while modern applications and integrations often rely on REST APIs.
In this scenario, we need to create an OIC REST Trigger that can:

  1. Accept a SOAP XML payload as input (request).
  2. Process the data.
  3. Return a SOAP XML response back to the caller.

This allows seamless communication between SOAP-based systems and modern RESTful endpoints without requiring the legacy system to change.


Solution Steps

1. Design the OIC Integration

  • Create a new App-Driven Orchestration in Oracle Integration.
  • Select REST as the trigger connection.

2. Configure the REST Trigger

  • Resource URL: e.g., /soapxmlhandler
  • HTTP Method: POST
  • Request Payload:
    • Set the media type to application/xml or text/xml.
    • Paste the SOAP request XSD in the request schema section.
  • Response Payload:
    • Also use application/xml or text/xml.
    • Paste the SOAP response XSD in the response schema section.





3. Import the SOAP Envelope Schema

  • Use the SOAPENV.xsd (for example, the 2006 OGC-hosted copy of the SOAP 1.1 envelope schema) to define the outer SOAP structure.
  • Import your business-specific XSD (e.g., VoltageDipIncidentCustomerAccountsMessage.xsd) for the actual payload.
Add an import statement to include the required XSD:

<xs:import namespace="http://iec.ch/TC57/2011/VoltageDipIncidentCustomerAccountsMessage"
 schemaLocation="VoltageDipIncidentCustomerAccountsMessage.xsd"/>


Add the request and response elements from the imported XSD:

<xs:element name="Body" type="tns:Body"/>
<xs:complexType name="Body">
    <xs:sequence>
        <xs:element ref="ns1:VoltageDipIncidentCustomerAccountsResponseMessage"/>
        <xs:element ref="ns1:CreatedVoltageDipIncidentCustomerAccounts"/>
    </xs:sequence>
    <xs:anyAttribute namespace="##any" processContents="lax">
        <xs:annotation>
            <xs:documentation>
                Prose in the spec does not specify that attributes are allowed on the Body element.
            </xs:documentation>
        </xs:annotation>
    </xs:anyAttribute>
</xs:complexType>



SOAP 1.1 Specification

4. Map the Incoming SOAP Request

  • Use OIC’s mapper to extract the SOAP Body content into integration variables.
  • Process or transform as required.

5. Prepare the SOAP Response

  • Map your processed data back into the SOAP Response structure.
  • Ensure proper namespace handling (as per the SOAP schema).

6. Test the REST Endpoint

  • Use Postman or SOAP UI:
    • Send a POST request with the full SOAP XML as the body.
    • Set the Content-Type header to text/xml.
  • Verify that the response is a valid SOAP envelope.
Tested XML payload:
<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/" 
            xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
            xmlns:ns1="http://iec.ch/TC57/2011/VoltageDipIncidentCustomerAccountsMessage">

    <s:Body>
        <ns1:CreatedVoltageDipIncidentCustomerAccounts>

            <!-- Header Section -->
            <Header>
                <Verb xmlns="http://iec.ch/TC57/2011/schema/message">created</Verb>
                <Noun xmlns="http://iec.ch/TC57/2011/schema/message">VoltageDipIncidentCustomerAccounts</Noun>
                <Revision xmlns="http://iec.ch/TC57/2011/schema/message">2.0</Revision>
                <Timestamp xmlns="http://iec.ch/TC57/2011/schema/message">2024-08-05T09:59:30.3759213+08:00</Timestamp>
                <Source xmlns="http://iec.ch/TC57/2011/schema/message">ABC</Source>
                <MessageID xmlns="http://iec.ch/TC57/2011/schema/message">638584487073759213</MessageID>
                <CorrelationID xmlns="http://iec.ch/TC57/2011/schema/message">638584487073759213</CorrelationID>
            </Header>

            <!-- Payload Section -->
            <Payload>
                <VoltageDipIncidentCustomerAccounts xmlns="http://iec.ch/TC57/2007/VoltageDipIncidentCustomerAccounts#">

                    <!-- Incident Record 1 -->
                    <IncidentRecord>
                        <mRID>INC1233038290</mRID>
                        <createdDateTime>2024-08-05T09:52:21+08:00</createdDateTime>
                        <CustomerAccounts>
                            <mRID>32812156411</mRID>
                        </CustomerAccounts>
                        <CustomerAccounts>
                            <mRID>32812156412</mRID>
                        </CustomerAccounts>
                    </IncidentRecord>

                    <!-- Incident Record 2 -->
                    <IncidentRecord>
                        <mRID>INC1233038291</mRID>
                        <createdDateTime>2024-08-05T08:32:25+08:00</createdDateTime>
                    </IncidentRecord>

                    <!-- Incident Record 3 -->
                    <IncidentRecord>
                        <mRID>INC1233038292</mRID>
                        <createdDateTime>2024-08-05T07:35:21+08:00</createdDateTime>
                        <CustomerAccounts>
                            <mRID>32812156412</mRID>
                        </CustomerAccounts>
                    </IncidentRecord>
               </VoltageDipIncidentCustomerAccounts>
            </Payload>
     </ns1:CreatedVoltageDipIncidentCustomerAccounts>
    </s:Body>
</s:Envelope>
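Outside Postman or SOAP UI, the same test can be scripted. A minimal Python sketch using only the standard library (the endpoint URL is a placeholder; substitute the activated integration's URL and add authentication as required by your OIC instance):

```python
import urllib.request

# Trimmed envelope for illustration; use the full tested payload above.
soap_xml = b"""<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/">
  <s:Body>...</s:Body>
</s:Envelope>"""

# Hypothetical endpoint - replace with your activated integration URL.
url = "https://example-oic.integration.ocp.oraclecloud.com/ic/api/integration/v1/flows/rest/SOAPXMLHANDLER/1.0/soapxmlhandler"

req = urllib.request.Request(url, data=soap_xml, method="POST")
req.add_header("Content-Type", "text/xml")  # per step 6

# resp = urllib.request.urlopen(req)  # uncomment to actually invoke
# print(resp.read().decode())
```

The key detail is the `Content-Type: text/xml` header; without it the REST trigger will reject the XML body.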

7. Deployment

  • Activate the integration in OIC.
  • Share the REST endpoint URL with the consuming SOAP system.


Monday, August 4, 2025

OIC - How to Rename a File in Oracle Object Storage using Oracle Integration (OIC)

Use Case

In Oracle Cloud, Object Storage is often used as a staging area for ERP file processing, such as GL Extract, HCM Extract, or bulk data loads. However, once a file is processed successfully in ERP, it’s best practice to rename the file to avoid reprocessing or for better traceability.

For example, after a GL Extract File is loaded into ERP successfully, we want to rename it by adding a prefix such as Processed_ or appending a timestamp. This avoids confusion and maintains clear file lifecycle management.


Solution Steps

1. Object Storage REST API – renameObject

Oracle Object Storage provides a renameObject action that allows you to rename an object within a bucket without re-uploading it.

API Endpoint Format:

POST /n/{namespaceName}/b/{bucketName}/actions/renameObject

Oracle Docs:
renameObject API – Oracle Cloud Infrastructure


2. Sample Request JSON

{
  "sourceName": "SourceObjectName",
  "newName": "TargetObjectName",
  "srcObjIfMatchETag": "*",
  "newObjIfMatchETag": "*",
  "newObjIfNoneMatchETag": "*"
}
  • sourceName → Current file name in the bucket.
  • newName → New file name after rename.
  • * in ETag fields ensures no version conflicts during rename.

3. Implementation in Oracle Integration (OIC)

Step 3.1 – Configure REST Invoke

  • Name: RenameGLExtractFile
  • Method: POST
  • Relative URI:
    /n/{namespaceName}/b/{bucketName}/actions/renameObject
    
  • Enable the "Add and review parameters" and "Configure request payload" options.

Step 3.2 – Create Request Mapping

From the OIC mapping canvas:

  • Map sourceName to the original filename variable (e.g., glExtFileName).
  • Map newName to the expression that generates the updated filename:
    concat(Var_PrefixTag_AddToFilename_FileUploaded, name)
    
  • Set srcObjIfMatchETag, newObjIfMatchETag, and newObjIfNoneMatchETag to "*".
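The mapping in step 3.2 can be illustrated with a small helper that builds the renameObject request body, applying the same prefix concatenation as the `concat(...)` expression (the function and variable names here are illustrative, not part of the OCI API):

```python
import json

def build_rename_request(source_name: str, prefix: str) -> str:
    """Build the renameObject request body, prefixing the original
    file name - the equivalent of the concat(...) mapping above."""
    body = {
        "sourceName": source_name,
        "newName": prefix + source_name,
        # "*" in the ETag fields means: rename regardless of version.
        "srcObjIfMatchETag": "*",
        "newObjIfMatchETag": "*",
        "newObjIfNoneMatchETag": "*",
    }
    return json.dumps(body, indent=2)

print(build_rename_request("GLExtract_20250804.txt", "Processed_"))
```

This produces exactly the JSON shown in section 2, with `newName` set to `Processed_GLExtract_20250804.txt`.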

Step 3.3 – Pass Template Parameters

  • bucketName → OIC variable holding the target bucket name.
  • namespaceName → OIC variable holding Object Storage namespace.

Step 3.4 – Test the Flow

Once the file is loaded successfully in ERP:

  1. Invoke the renameObject API via your configured REST connection.
  2. Verify in OCI Console → Object Storage → The file appears with the new name.

Example Scenario

  • Before Rename: GLExtract_20250804.txt
  • After Rename: Processed_GLExtract_20250804.txt

Tuesday, July 29, 2025

OIC - Handling Mixed File Inputs in Oracle Integration (OIC): Smart Zip Check and Transfer to SFTP

Use Case

In real-world integrations, especially with ERP or external systems, files received from source applications may vary in format — some may already be zipped while others may not. To ensure consistency and avoid errors during downstream processing or transfer to systems like SFTP, we need a way to check if a file is zipped and handle it accordingly.

This use case demonstrates an Oracle Integration Cloud (OIC) integration that intelligently detects whether an incoming file is a zipped archive. If the file is already zipped, it is forwarded directly to the target SFTP server. If it's not zipped, the integration compresses it and then transfers it to the target location.


Solution Steps

  1. Scope: CheckZipFile

    • A scope that encapsulates the logic for file evaluation and error handling.
  2. Main Flow:

    • Stage File – UnzipFile:
      Tries to unzip the incoming file.
      • If successful, it means the file was zipped. No further compression is needed.
      • If it fails (i.e., the file isn't a zip), an exception is thrown.
  3. Fault Handler: Default Handler

    • Stage File – ZipFile:
      This step is triggered only when UnzipFile fails, meaning the file wasn't zipped. The step compresses the incoming file.
    • Stitch – AssignFileRef:
      Assigns or updates the file reference to point to the newly zipped version for further processing or transfer.
  4. Downstream Processing 

    • The processed (either original zipped or newly zipped) file is sent to the target SFTP or other endpoints.
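The try-unzip-then-fall-back pattern above can be sketched in plain Python with the standard `zipfile` module. This is an illustration of the logic only; in OIC the check is the Stage File unzip attempt with the ZipFile step in the fault handler:

```python
import io
import zipfile

def ensure_zipped(file_name: str, content: bytes) -> bytes:
    """Pass already-zipped content through unchanged; compress
    anything else - mirroring the CheckZipFile scope."""
    if zipfile.is_zipfile(io.BytesIO(content)):
        return content  # already a zip archive, forward as-is
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr(file_name, content)  # wrap the raw file in a zip
    return buf.getvalue()
```

Note that a content-based check (zip magic bytes) is more reliable than trusting a `.zip` file extension from the source system.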

Benefits

  • Flexibility: Handles both zipped and non-zipped files without requiring format enforcement at the source.
  • Error Handling: Robust fallback logic ensures no failure in case of unexpected file formats.
  • Automation Ready: Ideal for file-based B2B integrations or scheduled ERP exports.


Monday, July 28, 2025

OIC - Smart File Polling from OCI Object Storage with Dynamic Day-Based Logic and Datadog Logging

Use Case

A business requirement involves polling files from Oracle Cloud Infrastructure (OCI) Object Storage, processing them to a target file system via Agent, and logging any missing files in Datadog. The file polling count dynamically varies based on the day of the week:

  • Friday: Expect 3 files
  • Saturday: Expect 1 file
  • Other days: Can be triggered manually via an adhoc flag

The solution ensures resilience through structured error handling and JSON-driven logic that categorizes files into MissingFiles and ProcessFiles.

Flow diagram


Solution Architecture Overview

The solution is designed using 3 OIC integrations and a supporting JSON structure:

{
  "MissingFiles": [
    {
      "FileName": ""
    }
  ],
  "ProcessFiles": [
    {
      "FileName": ""
    }
  ]
}

🔁 Integration 1: Scheduler Integration

  • Purpose: Triggers the flow based on the scheduled time or ad-hoc execution.
  • Steps:
    • Runs on a schedule (typically daily).
    • Accepts a flag adhocExecution = Y/N to override weekday logic.
    • Calls the Main Integration.

🔧 Integration 2: Main File Polling Integration

  • Purpose: List and categorize files from OCI Object Storage.
  • Steps:
    1. List all files from a configured object storage bucket.
    2. Determine required file count based on:
      • Day of the week (Friday = 3, Saturday = 1).
      • adhocExecution = Y allows polling on other days.
    3. Compare expected vs actual files.
    4. Populate a JSON object:
      • ProcessFiles: Files found and ready to process.
      • MissingFiles: Files not found (expected but missing).
    5. For each ProcessFile, invoke the Child Integration.
    6. Log MissingFiles to Datadog using REST API/log collector.
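Steps 2–4 above amount to a day-driven comparison of expected vs. listed files. A Python sketch of the same logic, mirroring the conditions in the XSLT further down (file names taken from that template; everything else is illustrative):

```python
EXPECTED_BY_DAY = {
    "Friday": ["BP_IDENTITY.csv", "BP_CREDITCARD.csv", "DLP_CA_NUM.csv"],
    "Saturday": ["BP_NAMES.csv"],
}

def categorize_files(day_of_week, bucket_files, adhoc_flag="N"):
    """Build the MissingFiles/ProcessFiles structure. On an ad-hoc run
    any known file found in the bucket is processed, but files are only
    reported missing on their scheduled day."""
    expected = EXPECTED_BY_DAY.get(day_of_week, [])
    candidates = expected
    if adhoc_flag == "Y":
        candidates = [f for day in EXPECTED_BY_DAY.values() for f in day]
    return {
        "MissingFiles": [{"FileName": f} for f in expected if f not in bucket_files],
        "ProcessFiles": [{"FileName": f} for f in candidates if f in bucket_files],
    }
```

Keeping the expected-file lists in one mapping (or, in OIC, a Lookup) avoids hard-coding the same file names in multiple branch conditions.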

If an expected file is missing, the integration throws a fault so the miss is surfaced and logged.



XSLT code for categorizing process files vs. missing files:
<xsl:template match="/xml:id_11">
  <ns1grp:write xml:id="id_117">
    <ns31:request-wrapper>
      <xsl:if test="($Var_dayOfWeek = &quot;Friday&quot;)">
        <xsl:if test="not($FileRef_Var/nsmpr0:response-wrapper/nsmpr0:objects[nsmpr0:name='BP_IDENTITY.csv'])">
          <ns31:MissingFiles>
            <ns31:FileName>
              <xsl:value-of select="'BP_IDENTITY.csv'"/>
            </ns31:FileName>
          </ns31:MissingFiles>
        </xsl:if>
        <xsl:if test="not($FileRef_Var/nsmpr0:response-wrapper/nsmpr0:objects[nsmpr0:name='BP_CREDITCARD.csv'])">
          <ns31:MissingFiles>
            <ns31:FileName>
              <xsl:value-of select="'BP_CREDITCARD.csv'"/>
            </ns31:FileName>
          </ns31:MissingFiles>
        </xsl:if>
        <xsl:if test="not($FileRef_Var/nsmpr0:response-wrapper/nsmpr0:objects[nsmpr0:name='DLP_CA_NUM.csv'])">
          <ns31:MissingFiles>
            <ns31:FileName>
              <xsl:value-of select="'DLP_CA_NUM.csv'"/>
            </ns31:FileName>
          </ns31:MissingFiles>
        </xsl:if>
      </xsl:if>
      <xsl:if test="($Var_dayOfWeek = &quot;Saturday&quot;)">
        <xsl:if test="not($FileRef_Var/nsmpr0:response-wrapper/nsmpr0:objects[nsmpr0:name='BP_NAMES.csv'])">
          <ns31:MissingFiles>
            <ns31:FileName>
              <xsl:value-of select="'BP_NAMES.csv'"/>
            </ns31:FileName>
          </ns31:MissingFiles>
        </xsl:if>
      </xsl:if>
      <xsl:if test="($Var_dayOfWeek = &quot;Friday&quot;) or (/nsmpr0:execute/ns17:request-wrapper/ns17:ProcessRequest/ns17:AdhocExecutionFlag = &quot;Y&quot;)">
        <xsl:if test="$FileRef_Var/nsmpr0:response-wrapper/nsmpr0:objects[nsmpr0:name='BP_IDENTITY.csv']">
          <ns31:ProcessFiles>
            <ns31:FileName>
              <xsl:value-of select="'BP_IDENTITY.csv'"/>
            </ns31:FileName>
          </ns31:ProcessFiles>
        </xsl:if>
        <xsl:if test="$FileRef_Var/nsmpr0:response-wrapper/nsmpr0:objects[nsmpr0:name='BP_CREDITCARD.csv']">
          <ns31:ProcessFiles>
            <ns31:FileName>
              <xsl:value-of select="'BP_CREDITCARD.csv'"/>
            </ns31:FileName>
          </ns31:ProcessFiles>
        </xsl:if>
        <xsl:if test="$FileRef_Var/nsmpr0:response-wrapper/nsmpr0:objects[nsmpr0:name='DLP_CA_NUM.csv']">
          <ns31:ProcessFiles>
            <ns31:FileName>
              <xsl:value-of select="'DLP_CA_NUM.csv'"/>
            </ns31:FileName>
          </ns31:ProcessFiles>
        </xsl:if>
      </xsl:if>
      <xsl:if test="($Var_dayOfWeek = &quot;Saturday&quot;) or (/nsmpr0:execute/ns17:request-wrapper/ns17:ProcessRequest/ns17:AdhocExecutionFlag = &quot;Y&quot;)">
        <xsl:if test="$FileRef_Var/nsmpr0:response-wrapper/nsmpr0:objects[nsmpr0:name='BP_NAMES.csv']">
          <ns31:ProcessFiles>
            <ns31:FileName>
              <xsl:value-of select="'BP_NAMES.csv'"/>
            </ns31:FileName>
          </ns31:ProcessFiles>
        </xsl:if>
      </xsl:if>
    </ns31:request-wrapper>
  </ns1grp:write>
</xsl:template>

Integration 3: Child File Processor

  • Purpose: Handles individual file transfer and cleanup.
  • Steps:
    • Download file from OCI Object Storage.
    • Write the file to a local file system via Agent.
    • Delete the file from OCI Object Storage post-processing.


Key Highlights

  • Dynamic logic using weekday and ad-hoc flags.
  • Robust processing pipeline using JSON mapping and loop controls.
  • Clean-up mechanism ensures files aren't reprocessed.
  • Monitoring integration using Datadog for transparency and alerting.

