Wednesday, September 17, 2025

OIC - OIC Utility to Reprocess Failed Real-Time Integration JSON Payloads

📌 Use Case

In real-time OIC integrations, JSON payloads are exchanged with external systems via REST APIs. When such integrations fail (due to downstream errors, connectivity issues, or invalid data), the input payload is saved into an OIC SFTP error folder for recovery.

Manually retrieving, reviewing, and resubmitting these payloads is time-consuming and error-prone.

To simplify recovery, we build an OIC Utility Service that:

  • Takes Interface ID and Payload File Name as inputs
  • Fetches error folder path and target REST endpoint (relative path) from a Lookup
  • Reads the failed payload file from the error folder
  • Encodes the payload to Base64 for handling, then decodes it back inside OIC
  • Calls the dynamic downstream REST service
  • Resubmits the payload as binary JSON for seamless reprocessing

This ensures that failed real-time integrations can be reprocessed quickly and reliably, without manual intervention.


🛠️ Solution Steps

1. Create a Lookup for Metadata

Define a Lookup in OIC with mappings for each interface:

  • Interface ID → Error Folder Path → Relative REST Endpoint Path
    Example:
HCM_IFC    | /u01/error/hcm/json    | /hcm/v1/worker
Payroll_IFC| /u01/error/payroll/json| /payroll/v2/run

2. Design the Utility App-Driven Orchestration

Trigger the utility with a REST endpoint that accepts:

  • Interface ID
  • Payload File Name
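For example, the trigger request body could look like this (the field names are illustrative, not a fixed contract):

```json
{
  "interfaceId": "HCM_IFC",
  "payloadFileName": "worker_20250917.json"
}
```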


3. Fetch Error Path and REST Endpoint from Lookup

  • Use Lookup functions to dynamically retrieve:
    • Error folder path
    • Relative endpoint URI

4. List Files in Error Folder

  • Use File Server (SFTP) action to list and fetch the payload file based on the provided name.
  • Capture the fileReference from the file action (pointer to the file inside OIC).

5. Read and Encode Payload

  • Use Read File (File Server Action) → Get the file content using fileReference.
  • Encode the payload into Base64, then decode back to binary JSON inside OIC.
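Conceptually, the Base64 round trip looks like the following sketch in plain JavaScript (Node's Buffer is used for illustration only; inside OIC the encode/decode is configured in the stage-file/adapter actions, and the payload fields here are made up):

```javascript
// Sketch: Base64 round trip for a JSON payload (illustrative, not OIC API code).
const payload = JSON.stringify({ workerId: 101, action: 'HIRE' }); // hypothetical payload
const encoded = Buffer.from(payload, 'utf8').toString('base64');   // encode for handling
const decoded = Buffer.from(encoded, 'base64').toString('utf8');   // decode back to JSON text
// decoded is byte-for-byte identical to the original payload
```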

6. Call Dynamic REST Service

  • Use HTTP Adapter with dynamic configuration:
    • Base URL → from OIC Connection
    • Relative Path → from Lookup
  • Pass the decoded JSON payload as the request body.





7. Handle Logging & Tracking

  • Log success/failure at each step (File found, Payload resubmitted, REST service status).
  • Update monitoring dashboard or custom tables for auditing.

Benefits

  • Automated reprocessing of failed JSON payloads
  • Dynamic & reusable across multiple interfaces via Lookup
  • Reduces manual errors in resubmission
  • Improves system reliability & recovery time for real-time integrations


Tuesday, September 16, 2025

OIC - Building a Utility Service in OIC to Reprocess Failed Files

📌 Use Case

In many OIC (Oracle Integration Cloud) projects, integrations involve file-based processing where files are picked from an inbound location and processed further.

However, some files may fail due to validation issues, network errors, or downstream service unavailability. Typically, failed files are moved into an error folder.

Instead of manually moving and reprocessing files, we can create a reusable utility App-Driven Orchestration in OIC that:

  • Takes input parameters: Interface ID, File Name, and File Processing Date
  • Identifies inbound and error folders from a lookup using the Interface ID
  • Lists all failed files in the error folder
  • Moves files back to the inbound folder
  • Calls the next integration service dynamically (via absolute endpoint URI)
  • Passes the required parameters to retry the file processing automatically

This makes reprocessing automated, consistent, and faster.


🛠️ Solution Steps

1. Create Lookup for Folder Paths

  • Define a Lookup in OIC with mappings:
    • Interface ID → Inbound Folder Path → Error Folder Path
  • Example:
    Payroll_IFC | /u01/inbound/payroll | /u01/error/payroll
    HCM_IFC     | /u01/inbound/hcm     | /u01/error/hcm
    

2. Design the App-Driven Orchestration (Utility Service)

  • Triggered by a REST endpoint that takes:
    • Interface ID
    • File Name
    • File Processing Date
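A sample trigger request body might look like this (field names and values are illustrative only):

```json
{
  "interfaceId": "Payroll_IFC",
  "fileName": "payroll_20250916.csv",
  "fileProcessingDate": "2025-09-16"
}
```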


3. Fetch Folder Paths from Lookup

  • Use OIC Lookup functions to fetch Error Folder and Inbound Folder based on the provided Interface ID.

4. List Files in Error Folder

  • Call FTP / File Adapter to list files from the error folder.
  • Apply filter by File Name + Date if provided.

5. Move Files from Error → Inbound

  • For each matching file:
    • Use File Adapter (Read/Write) or FTP Move operation
    • Move file from the error folder to the inbound folder

6. Call the Next Integration

  • Configure an HTTP Adapter to call the downstream OIC service (absolute endpoint).
  • Pass parameters like:
    • File Name
    • Processing Date
    • Interface ID
  • This re-triggers the main integration flow as if the file were newly dropped in the inbound folder.


7. Handle Errors and Logging

  • Add tracking:
    • Success → File reprocessed successfully
    • Failure → Log reason (e.g., file not found, service unavailable)
  • Store logs in OIC Activity Stream or custom log file.

✅ Benefits

  • Fully automated reprocessing of failed files
  • No manual intervention needed
  • Reusable utility → works across multiple integrations
  • Lookup-driven → easy to extend for new interfaces


OIC - Create a large test file by exponential doubling using a Windows batch script

📌 Use Case

You need a large text file quickly for testing integrations, stress-testing parsers, load tests, or demoing file-handling logic. The simplest approach on Windows is to start with a single line and repeatedly concatenate the file with itself — this doubles the line count each pass and reaches large sizes fast (exponential growth).


Explanation:

This batch script creates an output file with a single sample line and then doubles its size repeatedly by concatenating the file into a temporary file and back. Because every iteration multiplies the line count by two, you quickly reach tens or hundreds of thousands of lines with only a small number of iterations. The example below runs 17 doubling passes, producing 2^17 = 131,072 lines.


Code (copy-paste into a .bat file or .cmd file)

@echo off
setlocal EnableDelayedExpansion

:: Line content
set "line=This is the sample line"

:: Output file (adjust path if needed)
set "outfile=%USERPROFILE%\Desktop\output.txt"
if exist "%outfile%" del "%outfile%"

:: Start with 1 line
echo %line%>"%outfile%"

:: Double the file until it reaches ~100,000+ lines
for %%i in (1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17) do (
  type "%outfile%" >> "%outfile%.tmp"
  type "%outfile%.tmp" >> "%outfile%"
  del "%outfile%.tmp"
)

echo Done!
pause


Step-by-step solution / how it works

  1. Set up environment

    • @echo off hides command echoing.
    • setlocal EnableDelayedExpansion enables advanced variable handling (safe practice for scripts that modify variables in loops).
  2. Define the sample line

    • set "line=This is the sample line" — change the text inside quotes to whatever content you want repeated.
  3. Define and clear output file

    • set "outfile=%USERPROFILE%\Desktop\output.txt" places file on Desktop; change path as required.
    • if exist "%outfile%" del "%outfile%" removes any previous file with the same name.
  4. Seed file

    • echo %line%>"%outfile%" creates the file with a single line to start.
  5. Double the file repeatedly

    • The for %%i in (...) do (...) loop runs 17 times (you can change the number to control final size).
    • Inside loop:
      • type "%outfile%" >> "%outfile%.tmp" writes the current file content into a temp file (one copy).
      • type "%outfile%.tmp" >> "%outfile%" appends that temp file back to the original — now the original contains the old content plus the appended copy → doubled size.
      • del "%outfile%.tmp" removes the temp file.
  6. Finish

    • echo Done! informs completion and pause keeps the console open to view the message (press any key to exit).

How to choose number of iterations

  • Start with 1 line. Each iteration doubles the line count: after n iterations you have 2^n lines.
    • 10 iterations → 1,024 lines
    • 16 iterations → 65,536 lines
    • 17 iterations → 131,072 lines (the sample script)
  • If you want a specific line count, choose n = ceil(log2(desired_lines)).
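The iteration formula can be sketched as a one-liner (JavaScript shown purely for illustration; the batch script itself just hard-codes the loop count):

```javascript
// n = ceil(log2(desired_lines)): doubling passes needed starting from a 1-line seed.
function iterationsFor(desiredLines) {
  return Math.ceil(Math.log2(desiredLines));
}

console.log(iterationsFor(1024));   // 10
console.log(iterationsFor(131072)); // 17
```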

Tips & cautions

  • Disk space & memory: big files can consume significant disk space and may be slow on slow disks. Use caution on low-storage systems.
  • Encoding: echo/type produce ANSI encoding by default. If you need UTF-8, generate the file with PowerShell instead (e.g., Set-Content -Encoding UTF8) or ensure callers handle ANSI.
  • Permissions: run in a location where you have write access (Desktop is safe for user-run scripts).
  • Performance: doubling is fast but each type reads and writes the whole file — for extremely large sizes, consider streaming approaches or generating lines programmatically.
  • Cleanup: remember to delete the generated file after tests.

Thursday, September 11, 2025

OIC - Generating SAS Token for Azure Hub Access in OIC Using Built-in Functions, Without a Crypto Library

📌 Use Case

When integrating Oracle Integration Cloud (OIC) with Azure Event Hub / Service Bus / IoT Hub, authentication requires a Shared Access Signature (SAS) token.

  • This token is generated from:
    • Resource URI (sr)
    • Expiry time (se)
    • Shared Access Key Name (skn)
    • Shared Access Key (saKey)
  • The signature (sig) must be an HMAC-SHA256 hash of the resource URI and expiry, encoded in Base64 and URL-safe.

Instead of relying on external crypto libraries, we can leverage OIC’s built-in oic.crypto.hmacsha256 function to securely generate this SAS token inside integration code.


🛠 Solution Steps

1. Define Hex → Base64 URL-safe Converter

The Azure signature must be Base64 URL-encoded. In OIC JS functions, the HMAC result is hex, so we first convert it:

function hexToBase64UrlEncoded(hexString) {
  // Convert hex to byte array
  var bytes = [];
  for (var i = 0; i < hexString.length; i += 2) {
    bytes.push(parseInt(hexString.substr(i, 2), 16));
  }

  // Base64 character set
  var base64Chars = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/';
  var base64 = '';

  // Process every 3 bytes into 4 base64 characters
  for (var i = 0; i < bytes.length; i += 3) {
    var byte1 = bytes[i];
    var byte2 = i + 1 < bytes.length ? bytes[i + 1] : 0;
    var byte3 = i + 2 < bytes.length ? bytes[i + 2] : 0;

    var triplet = (byte1 << 16) | (byte2 << 8) | byte3;

    base64 += base64Chars[(triplet >> 18) & 0x3F];
    base64 += base64Chars[(triplet >> 12) & 0x3F];
    base64 += i + 1 < bytes.length ? base64Chars[(triplet >> 6) & 0x3F] : '=';
    base64 += i + 2 < bytes.length ? base64Chars[triplet & 0x3F] : '=';
  }

  // URL-encode the Base64 string
  return encodeURIComponent(base64);
}
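To sanity-check the converter outside OIC, you can compare its output against Node's built-in Buffer (an illustrative cross-check, not part of the OIC library):

```javascript
// Cross-check: hex "48656c6c6f" is the ASCII bytes of "Hello".
// Base64 of "Hello" is "SGVsbG8=", which URL-encodes to "SGVsbG8%3D".
const hex = '48656c6c6f';
const expected = encodeURIComponent(Buffer.from(hex, 'hex').toString('base64'));
console.log(expected); // "SGVsbG8%3D" -- hexToBase64UrlEncoded(hex) should return the same value
```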

2. Generate SAS Token in OIC Build Function

This function assembles the SAS token using OIC’s built-in crypto support:

function GetAzureHubAccessTokenOIC(uri, saName, saKey) {
  if (!uri || !saName || !saKey) {
    throw new Error("Missing required parameter");
  }

  var encoded = encodeURIComponent(uri);
  var now = new Date();

  // Token validity: 1 week
  var week = 60 * 60 * 24 * 7; // in seconds
  var ttl = Math.round(now.getTime() / 1000) + week;

  // String to sign
  var signature = encoded + '\n' + ttl;

  // HMAC-SHA256 using OIC built-in function
  var hashCode_value = oic.crypto.hmacsha256(signature, saKey);

  // SAS Token format
  var sasToken =
    "SharedAccessSignature sr=" + encoded +
    "&sig=" + hexToBase64UrlEncoded(hashCode_value) +
    "&se=" + ttl +
    "&skn=" + saName;

  return sasToken;
}

3. Output SAS Token

The function returns a SAS token like:

SharedAccessSignature sr=<resource-uri>
&sig=<signature>
&se=<expiry-timestamp>
&skn=<key-name>

Example:

SharedAccessSignature sr=https%3A%2F%2Fmyeventhubs.servicebus.windows.net%2Fsamplehub
&sig=abcdXYZ123%3D
&se=1726221440
&skn=RootManageSharedAccessKey

Key Takeaways

  • No external crypto library is required — OIC’s built-in oic.crypto.hmacsha256 handles signing.
  • hexToBase64UrlEncoded() ensures the signature is in the correct Base64 URL-safe format.
  • The generated SAS token can be directly used in HTTP headers for Azure Event Hub or Service Bus REST APIs.


Tuesday, September 9, 2025

OIC - How to Extend a Map in an Oracle Integration Cloud (OIC) Accelerator Project

Empowering customizations while preserving upgrade compatibility in OIC accelerator integrations


Use Case: When and Why to Extend a Map

Imagine your OIC accelerator integration was originally designed to sync Employee Data between your HR system and a downstream service. The default mapping includes fields like EmployeeID, FirstName, and LastName. Now, your business needs to include Department Code and Work Location in the mapping, which are not in the original integration flow.

To meet this requirement without modifying the original accelerator, and to keep your extension safe across future updates, you should extend the integration through an Extension Group—Oracle’s recommended, upgrade-safe mechanism for customizations.


Solution Steps: How to Do It (Accurately Guided by Oracle's Documentation)

1. Open the Accelerator Project and Select the Integration

  • In the OIC Projects pane, locate your accelerator project (marked with "Accelerator" and "Oracle").
  • Navigate to the integration you wish to modify.
  • From the Actions menu, choose Extend; this initiates extension mode (note: this option is exclusive to accelerator projects).

2. Insert an Extension Group at the Appropriate Point

  • Locate the step in the integration flow where you want the extended mapping to take place (e.g., before or after an existing Map or Invoke action).
  • Click the Add (+) icon at that point and select Extension Group, or choose Extend before / Extend after using the Actions menu inside the relevant action block.

3. Add the Map Action to the Extension Group

  • Inside the newly created Extension Group, click the Add icon or Integration actions menu and choose Map to create an ad-hoc mapping action.
  • The OIC mapper will open; drag and drop your new fields from source (e.g., DepartmentCode, WorkLocation) to the corresponding targets, applying any transformations as needed.

Notes for Extending a Map in OIC

1. Drag and map elements to get the new namespace prefix
Select the required elements from the source and map them to the corresponding target fields so the mapper generates the new namespace prefixes.

2. Custom DVM mappings to get the new namespace prefix
Use a Domain Value Map (DVM) wherever code-to-description or lookup translations are required, and add it as a custom DVM; namespace mismatches can cause issues in the extended map.

3. Verify and add namespaces and prefixes to the initial XSLT map
Check whether new namespaces or prefixes are introduced in the extended schema and ensure they are declared consistently.

4. Update the XSLT in Notepad++
Copy the initial map's XSLT code into Notepad++. Update namespace prefixes if needed, then replace the code from template to template wholesale in the extended map to ensure proper references.

5. Add new elements
Introduce any additional elements required by the business use case and map them appropriately.

4. Additional Customization (Optional But Powerful)

You can further enhance your extension with other actions, such as:

  • Data Stitch: To merge multiple payloads or variables.
  • For-Each: To process repeated elements.
  • Switch: To implement conditional routing within your flow.
  • Integration Invoke: To call child integrations.
  • Global Variables, Lookups, or JavaScript Libraries: For reusable variables, code translation/lookup logic, or custom script-based logic.

5. Save, Validate, and Activate

  • Once your extended mapping (and any extra actions) are configured, save the integration.
  • Validate the mapping; OIC will surface any missing or invalid connections.
  • Finally, activate the integration to apply your extension.

6. Preserve Your Extensions During Upgrades

One of the biggest advantages of using an Extension Group is that your customizations remain intact during future accelerator upgrades:

  • When a new version of the accelerator becomes available, you can choose to Merge latest extensions during the installation. Oracle will automatically apply your Extension Group customizations to the upgraded version.
  • Alternatively, if you skip automatic merging, you can still manually merge your extensions into the new version later.

Blog Summary: Why Use Extension Groups for Map Customization

  • Upgrade-safe: Your custom mapping stays intact during accelerator updates.
  • Structured customization: Extensions are isolated in their own group, easy to manage and modify.
  • Flexible extensibility: Besides maps, you can add loops, lookups, global variables, and child integrations.
  • Low-impact: No changes to the original accelerator code, minimizing risk.


Thursday, September 4, 2025

OIC - SAS Token Generation for Azure Event Hub REST API Authorization in OIC

📌 Use Case

When calling Azure Event Hub REST APIs from Oracle Integration Cloud (OIC), authentication requires a Shared Access Signature (SAS) token.

  • SAS token = authorization key generated using HMAC SHA256 hashing.
  • OIC does not provide native functions to generate this.
  • Solution → Create a custom JavaScript library to build SAS token dynamically and inject it into OIC REST calls.

SAS key vs SAS Token:

A SAS Key (Shared Access Signature key) is a security credential used in Microsoft Azure to grant limited, time-bound access to resources like Event Hubs, Blob Storage, Queues, and Service Bus.

🔑 How it works:

  • When you create an Azure resource (like an Event Hub namespace), Azure generates Access Keys for it.
  • These are usually two keys: Primary Key and Secondary Key.
  • Using one of these keys, you (or your code) can generate an SAS Token.
  • The SAS Token contains:
    • Resource URI (what you want to access)
    • Expiry time (when the token becomes invalid)
    • Signature (HMAC-SHA256 signed using the SAS Key)

👉 The SAS Key is the secret you store securely, and from it you generate SAS Tokens that your app or OIC flow uses in the Authorization header.

⚠️ Important:

  • Never expose the SAS Key directly in your apps or clients.
  • Always generate SAS Tokens from it and use those instead.

Solution - Flow diagram:


⚙️ Solution Steps

1. Build the OIC Custom Library

  • Download CryptoJS v3.1.2 from GitHub.
  • Copy content from:
    • rollups/hmac-sha256.js
    • components/enc-base64.js
  • Append the following function at the bottom:
function createSharedAccessTokenUpd(uri, saName, saKey) {
    if (!uri || !saName || !saKey) {
        throw new Error("Missing required parameter");
    }

    var encoded = encodeURIComponent(uri);
    var now = new Date();
    var week = 60 * 60 * 24 * 7; // 1 week in seconds
    var ttl = Math.round(now.getTime() / 1000) + week;
    var signature = encoded + '\n' + ttl;

    var hash = CryptoJS.HmacSHA256(signature, saKey);
    var hashInBase64 = CryptoJS.enc.Base64.stringify(hash);

    var sasToken = "SharedAccessSignature sr=" + encoded +
                   "&sig=" + encodeURIComponent(hashInBase64) +
                   "&se=" + ttl +
                   "&skn=" + saName;

    return sasToken;
}
  • Save file as OICHmacSHA256.js.
  • Upload it into OIC Libraries.



2. Generate SAS Token inside OIC Mapping

  • In the integration, open your mapping canvas for the REST call.
  • For the Authorization header, call the custom JS function:
createSharedAccessTokenUpd(
  "https://<your-namespace>.servicebus.windows.net",
  "DefaultFullSharedAccessSignature",
  "<your-shared-access-key>"
)

Here, the function dynamically generates the SAS token and places it in the Authorization header.


3. Mapping in the Integration XML

In the integration file, OIC internally translates this mapping into XSLT/XML. Example:

<ns25:StandardHttpHeaders>
  <ns25:Authorization>
    <xsl:value-of select="ora:js: createSharedAccessTokenUpd(
      &quot;https://test-ns.servicebus.windows.net&quot;,
      &quot;DefaultFullSharedAccessSignature&quot;,
      &quot;wjoLBJ...= &quot; )"/>
  </ns25:Authorization>
</ns25:StandardHttpHeaders>

This ensures every REST call to Azure Event Hub uses the correct SAS token dynamically generated at runtime.


Summary

  • Built a custom OIC library with CryptoJS HMAC-SHA256 + Base64.
  • Added SAS token generator function (createSharedAccessTokenUpd).
  • Called the function in OIC mapping → populated the Authorization header.
  • Verified via XSLT/XML that the SAS token gets injected into the REST API call.

This approach ensures secure and reusable Azure Event Hub connectivity from OIC.



Wednesday, September 3, 2025

XSD - Difference Between ref and type in XSD

Using type (local definition)

Here, we define elements directly with their datatypes.

<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="http://example.com/auth"
           xmlns="http://example.com/auth"
           elementFormDefault="qualified">

  <!-- Local elements -->
  <xs:element name="UserName" type="xs:string"/>
  <xs:element name="Password" type="xs:string"/>

</xs:schema>

XML Output (local namespace elements):

<auth:UserName xmlns:auth="http://example.com/auth">usr</auth:UserName>
<auth:Password xmlns:auth="http://example.com/auth">pwd</auth:Password>

Using ref (reusing global elements)

First, define global reusable elements in a common schema.

common.xsd

<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="http://example.com/common"
           xmlns="http://example.com/common"
           elementFormDefault="qualified">

  <!-- Global elements -->
  <xs:element name="MessageId" type="xs:string"/>
  <xs:element name="Timestamp" type="xs:string"/>

</xs:schema>

Now, reference those in the service schema.

service.xsd

<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="http://example.com/service"
           xmlns:svc="http://example.com/service"
           xmlns:common="http://example.com/common"
           elementFormDefault="qualified">

  <!-- Import the common schema -->
  <xs:import namespace="http://example.com/common" schemaLocation="common.xsd"/>

  <!-- Reference global elements -->
  <xs:element ref="common:MessageId"/>
  <xs:element ref="common:Timestamp"/>

</xs:schema>

XML Output (reused common namespace elements):

<common:MessageId xmlns:common="http://example.com/common">12345</common:MessageId>
<common:Timestamp xmlns:common="http://example.com/common">2025-09-03T10:00:00</common:Timestamp>

Combined SOAP Example

Both styles used together in a real request:

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:auth="http://example.com/auth"
                  xmlns:common="http://example.com/common">
   <soapenv:Header/>
   <soapenv:Body>
      <auth:LoginRequest>
         <!-- Local elements (defined with type) -->
         <auth:UserName>usr</auth:UserName>
         <auth:Password>pwd</auth:Password>

         <!-- Global reusable elements (referenced with ref) -->
         <common:MessageId>12345</common:MessageId>
         <common:Timestamp>2025-09-03T10:00:00</common:Timestamp>
      </auth:LoginRequest>
   </soapenv:Body>
</soapenv:Envelope>

✅ Summary

  • type → Local definition, used for service-specific fields (UserName, Password).
  • ref → Reference to global elements, used for shared fields (MessageId, Timestamp).

This way you can mix both:

  • Use type for fields unique to a service.
  • Use ref for fields that must come from a common namespace.


