Thursday, June 19, 2025

OIC - Designing a Reusable Callback Integration for Multiple FBDI Uploads in Oracle Integration Cloud (OIC)

🧾 Use Case Overview

In most Oracle Fusion implementations, File-Based Data Import (FBDI) is a widely used approach to load master and transactional data into Fusion Cloud. Each business object (like Employees, Items, Customers, Daily Rates, etc.) has a unique FBDI template and requires an integration that:

  1. Generates the FBDI ZIP file
  2. Uploads the file to UCM
  3. Submits an ESS Job (e.g., "Load Interface File for Import")
  4. Monitors the ESS job status
  5. Performs post-processing on success/failure

When you’re handling multiple business objects, steps 4 and 5 are usually identical across integrations. Repeating this logic in every flow makes it:

  • Redundant
  • Hard to maintain
  • Prone to errors

👉 So why not reuse this logic?


🎯 Goal

To create one common callback integration in OIC that can be invoked from any FBDI integration to:

  • Poll the ESS Job status
  • Handle success/failure
  • Perform downstream processing based on the business object

🧱 Architecture Overview

[ FBDI Integration: Employees     ] \
[ FBDI Integration: Items         ]  \
[ FBDI Integration: Daily Rates   ]   --> [ 🔁 Common Callback Integration ]
[ FBDI Integration: Customers     ]  /

Each main FBDI flow:

  • Ends by calling the Common Callback Integration
  • Sends a payload with:
    • requestId (ESS Job ID)
    • businessObject (like "EMPLOYEES")
    • fileName, submittedBy, etc.

🧰 Prerequisites

  • Oracle Integration Cloud Gen 2/3
  • ERP Cloud Adapter and SOAP connection to ERPIntegrationService
  • Basic understanding of:
    • FBDI process
    • ESS Jobs in Fusion
    • While/Switch activities in OIC

🧭 Detailed Implementation Steps


Step 1: FBDI Integration Flow (Example: Daily Rates)

This is your normal FBDI flow:

  1. Read source data
  2. Transform and generate FBDI .zip file
  3. Upload to UCM using ERP Cloud Adapter
  4. Submit ESS Job using submitESSJobRequest
  5. Capture requestId from the response
  6. Call Common Callback Integration with a payload:
{
  "requestId": "456789",
  "businessObject": "DAILY_RATES",
  "fileName": "DailyRates_20250618.zip",
  "submittedBy": "ManojKumar"
}

Step 2: Create the Common Callback Integration

Integration Type: App-Driven Orchestration
Trigger: REST Adapter (POST operation)

📥 Input JSON Schema:

{
  "requestId": "string",
  "businessObject": "string",
  "fileName": "string",
  "submittedBy": "string"
}

Step 3: Parse and Assign Variables

  • Assign requestId, businessObject, and other fields to local variables.
  • Initialize:
    status = ""
    loopCount = 0
    

๐Ÿ” Step 4: Implement Polling Logic using While Loop

Condition:

status != "SUCCEEDED" AND status != "ERROR" AND loopCount < 20

Inside the loop:

  1. Call getESSJobStatus via ERPIntegrationService SOAP connection
  2. Parse response:
    <JobStatus>SUCCEEDED</JobStatus>
    <Message>Completed successfully</Message>
    
  3. Assign status to local variable
  4. Wait for 1 minute (use Wait activity)
  5. Increment loopCount += 1
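Expressed as plain Java, the polling logic above looks roughly like this (illustrative only: in OIC the status call is the getESSJobStatus SOAP invoke and the pause is a Wait activity; the class and parameter names here are hypothetical):

```java
import java.util.function.Supplier;

public class EssJobPoller {

    // Polls until the ESS job reaches a terminal state or the retry limit is hit.
    // getEssJobStatus stands in for the getESSJobStatus SOAP invoke;
    // waitMillis stands in for the 1-minute Wait activity (60_000 ms in the real flow).
    public static String poll(Supplier<String> getEssJobStatus, long waitMillis)
            throws InterruptedException {
        String status = "";
        int loopCount = 0;
        while (!"SUCCEEDED".equals(status) && !"ERROR".equals(status) && loopCount < 20) {
            status = getEssJobStatus.get();
            loopCount++;
            if (!"SUCCEEDED".equals(status) && !"ERROR".equals(status)) {
                Thread.sleep(waitMillis); // Wait activity
            }
        }
        return status; // may still be non-terminal if the retry limit was reached
    }
}
```

Note that the loop exits as soon as a terminal status is seen, so a job that succeeds on the first poll never incurs the wait.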

🧠 Step 5: Decision Based on Status

After exiting the loop, check if:

  • status == "SUCCEEDED": proceed with business logic
  • status == "ERROR": log failure and send notification

🧪 Step 6: Use Switch for Business Object-Specific Logic

Switch on businessObject:
├── "DAILY_RATES"   → Call Daily Rates post-processing
├── "EMPLOYEES"     → Call Employees HDL flow
├── "ITEMS"         → Write data to DB or update flag
├── "CUSTOMERS"     → Trigger BIP report / send confirmation

Use Local Integration Calls or inline logic as needed.
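As a mental model, the Switch resolves to something like the following (a sketch only: in OIC this routing is configured on the canvas, and the handler names below are made up for illustration):

```java
public class CallbackRouter {

    // Returns the downstream action name for a business object,
    // mirroring the Switch branches above. Handler names are hypothetical.
    public static String route(String businessObject) {
        switch (businessObject) {
            case "DAILY_RATES": return "dailyRatesPostProcessing";
            case "EMPLOYEES":   return "employeesHdlFlow";
            case "ITEMS":       return "itemsDbUpdate";
            case "CUSTOMERS":   return "customersBipReport";
            default:            return "unhandled"; // log and notify in the real flow
        }
    }
}
```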


Step 7: Outputs We Can Fetch After getESSJobStatus

When getESSJobStatus completes, the response includes a reportFile or document ID that points to the output/log files. We can fetch:

  1. .log file (execution log)
  2. .out file (output message, summary of load)
  3. .csv error file (for rows that failed)

Call getESSJobExecutionDetails (Optional)

We can invoke another operation (if available) to get details of the child job, if the job is a job set or composite.

Alternative Approach (Preferred):

Use ERPIntegrationService.downloadESSJobExecutionDetails or UCM file download API to download the .log and .out files using requestId.


Use UCM Web Service to Download Files

Once the ESS job runs, the output files are stored in UCM. We can call ERPIntegrationService > downloadExportOutput:

Input: requestId. The response returns the file content as Base64; parse it, or store it in a DB or on FTP for audit.

Alternatively, use the WebCenter Content API (UCM API) to list files by requestId and download them.
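A sketch of the post-download step, assuming you already hold the Base64 string returned for a requestId (the class and method names are hypothetical):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Base64;

public class EssOutputSaver {

    // Decodes the Base64 payload returned for a requestId and writes it
    // to a local file (e.g., for audit before pushing to DB/FTP).
    public static Path save(String base64Content, Path target) throws IOException {
        byte[] bytes = Base64.getDecoder().decode(base64Content);
        return Files.write(target, bytes);
    }
}
```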

Sample Output from .out File (Import Summary)

Total Records Read: 100  
Successfully Imported: 95  
Failed Records: 5  
Log File: import_daily_rates.log

📧 Step 8: Optional Email Notification

Send an email with:

  • ESS Job Result
  • File name
  • Business object
  • Message or error (if failed)

📂 Sample getESSJobStatus Request Payload (SOAP)

<typ:getESSJobStatusRequest>
   <typ:requestId>456789</typ:requestId>
</typ:getESSJobStatusRequest>

Sample Response:

<typ:getESSJobStatusResponse>
   <typ:JobStatus>SUCCEEDED</typ:JobStatus>
   <typ:Message>Completed successfully</typ:Message>
</typ:getESSJobStatusResponse>

🚨 Error Handling Strategy

  • If ESS Job fails (ERROR), log:
    • requestId
    • businessObject
    • error message
  • Store in DB or call a notification integration
  • Enable retry if needed

💡 Best Practices

  • Set a polling limit (e.g., 20 retries = ~20 mins)
  • Avoid infinite loops
  • Use consistent naming conventions for businessObject
  • Create reusable sub-integration flows for downstream processing
  • Add logging and tracking (e.g., via ATP/Logging framework)

🚀 Enhancements We Can Add

  • Add DB persistence for incoming callback metadata
  • Scheduled Integration to recheck failed jobs
  • Audit dashboard for all FBDI callbacks
  • Notify users in MS Teams / Slack using Webhook

Conclusion

Building a common callback integration for all FBDI flows:

  • Makes your integrations modular and maintainable
  • Reduces redundancy
  • Centralizes your error handling and monitoring

This pattern can be extended to HCM Extracts, BIP report monitoring, and ESS job chains as well.


📦 Sample Naming Suggestions

  • Integration: INT_COMMON_ESS_CALLBACK
  • SOAP Connection: ERPIntegrationServiceSOAP
  • Variable (requestId): varRequestId
  • Variable (loop counter): varLoopCount
  • Email Subject: FBDI ${businessObject} - Job ${status}


OIC Gen3 - New Feature - File Polling Feature using FTP trigger in OIC

Unlocking Efficient FTP Triggers: Using the New File‑Polling Feature in Oracle Integration Cloud (OIC Gen3)

Subtitle:
Learn how to automate smaller file reads from FTP servers using the built‑in file‑polling trigger in OIC Gen3 24.10+.

🛠 Use Case

Many integration scenarios require processing files placed onto an FTP server—like daily CSV or XML reports—without manual intervention. Prior to OIC Gen3 24.10, triggering on file arrival involved workarounds such as scheduled scripts or custom polling logic.

With the new File‑Polling feature, you can:

  • Trigger OIC integrations based on new files matching a naming pattern.
  • Auto‑load file contents as payload—ideal for lightweight file reads.
  • Configure archive, delete, or reject handling.
  • Avoid downloads, saving bandwidth and simplifying flow.

🔧 Solution Overview: Step‑by‑Step

  1. Ensure Compatibility
    Verify you're running OIC Gen3 version 24.10 or higher—this is when FTP file‑polling became available.

  2. Set Up FTP Connection
    In your OIC connection settings, choose or configure your FTP/SFTP source.

  3. Use File‑Polling Trigger
    In the integration builder, select the “File Polling” trigger. You’ll see options for:

    • Polling frequency (e.g., every 5 minutes)
    • Source directory
    • Filename pattern (e.g., *.csv)
    • Schema type (CSV, XML), plus sample file upload support
  4. File Handling Options
    Decide what happens after triggering:

    • Archive to another folder
    • Move after successful read
    • Delete automatically
    • Ignore delete‑errors to prevent retries
    • Reject invalid files
  5. Design Integration Flow
    After the trigger, use the file’s contents payload to:

    • Parse with a schema
    • Route data to downstream systems
    • Handle errors via reject logic
  6. Test and Validate (POC)
    Always run a proof‑of‑concept:

    • Drop a test file matching your pattern
    • Confirm the integration triggered as expected
    • Validate the post‑processing behavior (archive/move/delete)
  7. Deploy and Monitor
    Once verified, deploy your integration. Monitor success/failure and adjust polling or file‑handling parameters as needed.

Below is a demonstration of how to poll a file:







Saturday, June 14, 2025

OIC - Monitoring and Troubleshooting Integrations in Oracle Integration Cloud (OIC) Gen 3: A Practical Guide

📘 Use Case:

As an OIC developer or integration lead, you often need to monitor live integrations for performance, failures, and latency issues. With OIC Gen 3, enhanced tools like Observability dashboard, Projects tab, and Activity Stream help you quickly identify, trace, and resolve issues.

🛠️ Solution Steps: Monitoring & Troubleshooting in OIC Gen 3

๐Ÿ” 1. Observability Dashboard (Home → Observability)

  • View overall system health:
    • Success/failure trends
    • Execution volumes
    • Error frequency by integration
  • Helps you monitor across all projects and integrations.

📌 Tip: Use date filters and drill into specific time windows for better root cause analysis.


๐Ÿ“ 2. Projects Tab + Observe Subtab (Projects → [Your Project] → Observe)

The Observe tab under Projects gives real-time, project-specific analytics.

Key Features:

  • Live status of all integrations in the project.
  • Visual indicators for:
    • Failed instances
    • Slow executions
    • Backlogged or retrying flows
  • You can click an integration to view:
    • Run history
    • Payload details
    • Fault/error points

📌 Use Case: Great for team-based development — each team can monitor and troubleshoot flows relevant to their assigned project.


🔄 3. Activity Stream (Monitor → Tracking)

  • Track integration instances using filters:
    • Integration name
    • Status (failed, completed)
    • Date/time range
  • Open individual runs to:
    • View complete execution trace
    • See error messages and transformation values
    • Access input/output payloads

📌 Pro Tip: Combine this with the Observe view to jump from high-level KPIs to instance-level diagnostics.


🧰 4. Diagnostic Logs (Observability → Diagnostic Logs)

  • Search logs using:
    • Integration name
    • Flow instance ID
    • Timestamps
  • Useful for back-end or infrastructure issues that are not visible in instance-level logs.

🔔 5. Notifications & Alerts

  • Use Integration Insight or configure external notification logic to send alerts when:
    • Integration fails repeatedly
    • A flow runs unusually long
    • SLA thresholds are breached

🧪 6. Replay and Testing

  • For eligible integrations, replay failed instances after correcting data or logic.
  • Also supports test executions from integration canvas for verification before redeploying.



Tuesday, June 10, 2025

OIC - Line-by-Line Data Parsing in OIC Using Custom XSD Schema

Use Case:

When handling files containing rowwise data, such as payment information, it’s crucial to accurately parse each row based on a delimiter—like end-of-line (EOL). In this scenario, the incoming file data is represented in an XML format, where rows (or records) are wrapped within the XML structure and separated by EOL. To enable seamless integration and processing in Oracle Integration Cloud (OIC), we define a custom XSD schema to parse each row correctly and handle these line breaks.


Solution Steps:

✅ 1 Create the XSD Schema for Rowwise Parsing

We’ve designed an XSD schema (payments.xsd) with the following key features:

  • The root element <payments> contains multiple <MyDataSet> elements.
  • Each <MyDataSet> represents a row of data and uses nxsd:cellSeparatedBy="${eol}" to handle EOL-separated entries.
  • Each row contains a single <MyData> element representing the data in that line, with nxsd:style="terminated" and nxsd:terminatedBy="${eol}" to ensure accurate row termination.
<?xml version="1.0" encoding="UTF-8" ?>
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
    xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd"
    xmlns:tns="http://xmlns.sky.com.mx/sib/FormateaEfectivoPagosAnticipados/OficinaNICRFAdptr"
    targetNamespace="http://xmlns.sky.com.mx/sib/FormateaEfectivoPagosAnticipados/OficinaNICRFAdptr"
    elementFormDefault="qualified"
    attributeFormDefault="unqualified"
    nxsd:version="NXSD"
    nxsd:stream="chars"
    nxsd:encoding="UTF-8">

    <xsd:element name="payments">
        <xsd:complexType>
            <xsd:sequence>
                <xsd:element name="MyDataSet" nxsd:cellSeparatedBy="${eol}" minOccurs="1" maxOccurs="unbounded">
                    <xsd:complexType>
                        <xsd:sequence>
                            <xsd:element name="MyData" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="${eol}" nxsd:quotedBy="&quot;" />
                        </xsd:sequence>
                    </xsd:complexType>
                </xsd:element>
            </xsd:sequence>
        </xsd:complexType>
    </xsd:element>

</xsd:schema>

✅ 2 Upload the XSD Schema in OIC

  • Navigate to OIC -> Artifacts -> XSD Schemas.
  • Upload the payments.xsd file to use it for validating and parsing incoming rowwise data files.

✅ 3 Integration in OIC

  • Use the File Adapter or REST Adapter to bring in the file as input.
  • Use the uploaded XSD to parse the file into structured data rows automatically.
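Conceptually, the NXSD settings do the following; this plain-Java sketch mimics the parsing behavior (it is an illustration, not OIC code):

```java
import java.util.ArrayList;
import java.util.List;

public class RowwiseParser {

    // Splits the file content by EOL so each line becomes one record
    // (one <MyDataSet>/<MyData> in the schema), stripping surrounding
    // double quotes as nxsd:quotedBy="&quot;" does.
    public static List<String> parse(String fileContent) {
        List<String> rows = new ArrayList<>();
        for (String line : fileContent.split("\\r?\\n")) {
            if (line.isEmpty()) {
                continue; // skip blank lines
            }
            if (line.length() >= 2 && line.startsWith("\"") && line.endsWith("\"")) {
                line = line.substring(1, line.length() - 1);
            }
            rows.add(line);
        }
        return rows;
    }
}
```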

Benefits of this Approach:

🔹 Accurate Rowwise Parsing – The schema ensures each line is treated as a separate record using EOL delimiters.
🔹 Easy Integration – Directly parse and validate rowwise data in OIC, reducing transformation complexity.


OIC - Verifying RSA Signatures Using OCI Functions and Java – A Seamless OIC Integration

Use Case

In today’s cloud-driven world, secure communication is paramount. One critical aspect of security is verifying digital signatures using asymmetric RSA cryptography. In this scenario, you need to:

✅ Validate incoming data from a third-party system using RSA signature verification.
✅ Perform the verification logic using a Java library (with SHA256withRSA).
✅ Deploy this verification as a serverless OCI Function.
✅ Invoke this function securely from Oracle Integration Cloud (OIC) to ensure your integration flows can trust the incoming data.


Solution Overview

We’ll cover:

1️⃣ Writing a Java class (RSASignatureVerifier) to verify RSA signatures.
2️⃣ Deploying this logic as an OCI Function (Java runtime).
3️⃣ Calling this OCI Function from OIC (via the Function Adapter).

This ensures your verification logic is centralized, secure, and easily maintainable.


Step-by-Step Solution

Step 1: Java Class for RSA Signature Verification

Here’s the complete Java code (RSASignatureVerifier.java).

import java.security.*;
import java.security.spec.*;
import java.util.Base64;

public class RSASignatureVerifier {

    /**
     * Verifies an RSA signature.
     *
     * @param data The original data that was signed.
     * @param signatureBase64 The signature in Base64 encoding.
     * @param publicKeyBase64 The RSA public key in Base64 encoding (X.509 format).
     * @param hashType Hashing algorithm to use (e.g., "SHA256withRSA").
     * @return true if the signature is valid, false otherwise.
     * @throws Exception on errors during verification.
     */
    public boolean verifySignature(String data, String signatureBase64, String publicKeyBase64, String hashType) throws Exception {
        // Decode the public key
        byte[] publicKeyBytes = Base64.getDecoder().decode(publicKeyBase64);
        X509EncodedKeySpec keySpec = new X509EncodedKeySpec(publicKeyBytes);
        KeyFactory keyFactory = KeyFactory.getInstance("RSA");
        PublicKey publicKey = keyFactory.generatePublic(keySpec);

        // Decode the signature
        byte[] signatureBytes = Base64.getDecoder().decode(signatureBase64);

        // Initialize the Signature object for verification
        Signature signature = Signature.getInstance(hashType);
        signature.initVerify(publicKey);
        signature.update(data.getBytes("UTF-8"));

        // Verify the signature
        return signature.verify(signatureBytes);
    }

    // Example usage
    public static void main(String[] args) throws Exception {
        RSASignatureVerifier verifier = new RSASignatureVerifier();

        String data = "12345678910";
        String publicKeyBase64 = "MIIBIjANBgkqhkiG9w0BAQEFAAOC..."; // truncated for brevity
        String signatureBase64 = "4BDmT8Qm8c9EQDdKLor7DXN5u..."; // truncated

        boolean isValid = verifier.verifySignature(data, signatureBase64, publicKeyBase64, "SHA256withRSA");
        System.out.println("Signature valid? " + isValid);
    }
}


Step 2: Deploy as an OCI Function

1️⃣ Create a new OCI Function using the Java runtime.
2️⃣ Package your Java code (including the RSASignatureVerifier class) as a JAR.
3️⃣ Implement the Function’s entry point (handleRequest) to accept JSON with these fields:

{
  "data": "string to verify",
  "signatureBase64": "base64-encoded signature",
  "publicKeyBase64": "base64-encoded public key",
  "hashType": "SHA256withRSA"
}

4️⃣ Parse the JSON, call verifySignature, and return the verification result as JSON.

Here’s a sample Function handler:

import com.fnproject.fn.api.*;
import java.util.Map;

public class VerifySignatureFunction {
    public boolean handleRequest(Map<String, String> input) throws Exception {
        RSASignatureVerifier verifier = new RSASignatureVerifier();

        String data = input.get("data");
        String signatureBase64 = input.get("signatureBase64");
        String publicKeyBase64 = input.get("publicKeyBase64");
        String hashType = input.get("hashType");

        return verifier.verifySignature(data, signatureBase64, publicKeyBase64, hashType);
    }
}

Step 3: Invoke OCI Function from OIC

✅ Use the Oracle Integration Function Adapter.
✅ Configure it to call your OCI Function endpoint.
✅ Pass the required JSON payload dynamically from your OIC integration (e.g., mapping from integration data fields).


Result

🎉 Your integration flows in OIC now securely validate signatures before proceeding!
🛡️ You’ve centralized this cryptographic logic in an OCI Function, making it secure and reusable across multiple flows.


Conclusion

🔑 By combining Java’s cryptography libraries, OCI Functions, and OIC’s Function Adapter, you’ve built a secure, scalable, and maintainable solution for RSA signature verification.

To generate an RSA public key for testing, you can use the online tool below:

https://8gwifi.org/RSAFunctionality?rsasignverifyfunctions=rsasignverifyfunctions&keysize=2048
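If you prefer not to rely on an external site, the same test inputs can be generated locally with the JDK. This sketch (the class name is made up) produces an X.509-encoded public key and a SHA256withRSA signature compatible with the verifier above:

```java
import java.nio.charset.StandardCharsets;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.Signature;
import java.util.Base64;

public class RSATestKeyGen {

    // Generates a 2048-bit RSA key pair and signs `data`;
    // returns { publicKeyBase64, signatureBase64 }.
    public static String[] generate(String data) throws Exception {
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
        kpg.initialize(2048);
        KeyPair kp = kpg.generateKeyPair();

        // X.509 encoding, matching the X509EncodedKeySpec used by the verifier
        String publicKeyBase64 = Base64.getEncoder().encodeToString(kp.getPublic().getEncoded());

        Signature signer = Signature.getInstance("SHA256withRSA");
        signer.initSign(kp.getPrivate());
        signer.update(data.getBytes(StandardCharsets.UTF_8));
        String signatureBase64 = Base64.getEncoder().encodeToString(signer.sign());

        return new String[] { publicKeyBase64, signatureBase64 };
    }

    public static void main(String[] args) throws Exception {
        String[] out = generate("12345678910");
        System.out.println("publicKeyBase64 = " + out[0]);
        System.out.println("signatureBase64 = " + out[1]);
    }
}
```

Feed the two printed values, together with the same data string, into RSASignatureVerifier.verifySignature to confirm it returns true.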

Sunday, June 8, 2025

OIC - Can you explain how you use REST and SOAP connections in OIC?

Answer:

In Oracle Integration Cloud (OIC), REST and SOAP connections are set up to enable communication with external applications or services that expose their APIs in these formats.

🔹 REST Connections:

  • I create REST connections using the REST Adapter in OIC.
  • The connection is configured by specifying the base URL, authentication (like OAuth 2.0, Basic Auth), and optional headers.
  • I typically use REST connections to integrate with modern web services, like external APIs or Oracle SaaS REST endpoints (e.g., ERP/HCM Cloud REST APIs).

🔹 SOAP Connections:

  • I create SOAP connections using the SOAP Adapter.
  • This involves uploading the WSDL file or providing the WSDL URL.
  • The adapter uses the WSDL to define available operations and data structures.
  • I use SOAP connections for integrations with legacy systems or Oracle SaaS SOAP web services (e.g., certain HCM/ERP services that are still SOAP-based).

Practical Use in OIC Integrations:

  • Once the connections are created, they are used in the integration flow.
  • I drag and drop the REST or SOAP connections onto the integration canvas as invoke actions to send or receive data.
  • In the mapper, I map the incoming or outgoing payloads to match the API’s structure.

OIC - What is the difference between synchronous and asynchronous integrations? Can you give examples in OIC?

Difference:

  • Synchronous (sync) integrations are blocking – the calling system waits for the integration to finish and provide a response. These are best suited for real-time tasks that require immediate feedback.
    🔹 Example in OIC: A REST API-based integration where a frontend application sends a request to get the current stock level of a product in ERP and waits for the response immediately.

  • Asynchronous (async) integrations are non-blocking – the calling system sends the request, and OIC processes it in the background, sending the response separately. These are used for batch processing or long-running tasks where real-time response isn’t needed.
    🔹 Example in OIC: An integration that receives a file (e.g., via FTP adapter) with thousands of records and processes it in the background to load data into Oracle ERP Cloud using FBDI.



