Friday, December 27, 2024

OIC - Convert a package to a project

To convert a user-developed package into a project in Oracle Integration:
  1. Navigate to Design > Packages.
  2. Hover over the package to convert (accelerator and recipe packages are not eligible).
  3. Click the Actions icon and select Create Project.
  4. Enter a name, keywords, and an optional description.
  5. Select or leave unchecked the checkbox that controls whether anyone can edit, view, and monitor the project's resources.
  6. Click Create. A message confirms the project creation and provides a link to access it.

The newly created project includes all resources (integrations, connections, lookups, and JavaScript libraries) from the package, leaving the original package unchanged.
Screenshots:






Thursday, December 26, 2024

OIC - Restrictions on Using Stage File Action Operations with the File/Attachment Features of the Connectivity Agent


When configuring the stage file action in the Configure Stage File Action wizard, note that there are restrictions on using some operations with the connectivity agent.
  • List File operation: Files uploaded through the connectivity agent are not available with the List File operation.
  • Read Entire File and Read File in Segments operations: Files uploaded through the connectivity agent can only be read with a file reference.
  • Zip File operation: Files uploaded through the connectivity agent are not available with the Zip File operation.
  • Unzip File operation: Files uploaded through the connectivity agent can only be unzipped using a file reference.
Reference:

OIC - "Dynamically Configuring Relative URLs for REST API Invocations in Oracle Integration Cloud" | Dynamically invoke local integration

Use Case:

In Oracle Integration Cloud (OIC), there is often a need to dynamically determine the Relative URI while invoking REST APIs. This is particularly useful in scenarios where integration flows must interact with multiple endpoints or services based on dynamic parameters, such as job names, document IDs, or other runtime variables.

Example Scenario:

A bulk data export integration requires a lookup of specific job names and their corresponding endpoint URIs dynamically during runtime. The target REST API expects the Relative URI to change based on the job being processed, ensuring the correct endpoint is invoked.

Solution:

To achieve this, you can use the lookupValue function within the OIC mapper to resolve the required Relative URI dynamically. Here's how:

1. Setup a Lookup Table in OIC:

Create a lookup table in OIC with columns for "Job Name" and "Integration URL". Populate the lookup with key-value pairs for job names and their corresponding relative URIs.

2. Configure the Mapper:

In the mapper, map the required source variable (e.g., Var_JobName) to the target Relative URI field.

Use the lookupValue function to fetch the URI dynamically based on the job name.

Syntax:

lookupValue("Lookup_Name", "JobName", Var_JobName, "IntegrationURL", "Default_Value")


The dynamic URI passed from the lookup follows this format:
/ic/api/integration/v1/flows/rest/<integrationIdentifier>/1.0/
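
For illustration, the mapping for the invoke's Relative URI target field might look like the following (the lookup name and default value here are assumptions, not part of the original configuration):

lookupValue("Integration_Endpoint_Lookup", "JobName", Var_JobName, "IntegrationURL", "/ic/api/integration/v1/flows/rest/FALLBACK_FLOW/1.0/")

At runtime, the value of Var_JobName selects the matching row, and the resolved relative URI is passed into the REST invoke's connectivity properties.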

Limitation:
All the REST APIs being invoked dynamically must have the same request and response payload structures to ensure seamless mapping and processing in the integration.

Invoke local Integration
The same functionality can be achieved using the "Invoke Local Integration" action when calling multiple local integrations. However, this requires all the local integrations to share the same request and response payload structures for consistent processing.

We have to pass the integration code (identifier) and version dynamically from the lookup.
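
A minimal sketch of those two mappings, assuming a lookup (name, columns, and default values are illustrative) that stores the code and version per job name:

lookupValue("Local_Integration_Lookup", "JobName", Var_JobName, "IntegrationCode", "NOT_FOUND")
lookupValue("Local_Integration_Lookup", "JobName", Var_JobName, "IntegrationVersion", "01.00.0000")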






Wednesday, December 25, 2024

OIC - How to add days to a date (YYYY-MM-DD) using prebuilt functions

Use Case: 

Adding a Specific Number of Days to a Database Fetched Date in the Format YYYY-MM-DD

Logic Steps:

  1. Convert the input date (in YYYY-MM-DD format) to a date object using the xsd:date() function.
  2. Add the required number of days as an xsd:dayTimeDuration value, e.g. xsd:dayTimeDuration("P10D") for 10 days (use "P100D" for 100 days).
  3. Finally, convert the result back to a string using the string() function.

Solution:

string((xsd:date(/nstrgmpr:execute/ns20:request-wrapper/ns20:inputDate) + xsd:dayTimeDuration("P10D")))
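
As a quick illustrative check (the input value is assumed for the example), the same expression with a literal date:

string(xsd:date("2024-12-25") + xsd:dayTimeDuration("P10D"))

returns 2025-01-04.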



Tuesday, December 24, 2024

OIC - JavaScript - How to add days to a date in YYYY-MM-DD format

Use Case:

This function demonstrates how to add a specific number of days to a date provided in the YYYY-MM-DD format.

Javascript code used:

function addDays(startDate, daysToAdd) {
    // Convert the YYYY-MM-DD string to a Date object (month is zero-based)
    const rhDate = new Date(startDate.slice(0, 4), startDate.slice(5, 7) - 1, startDate.slice(8, 10));
    // Add the requested number of days
    rhDate.setDate(rhDate.getDate() + Number(daysToAdd));
    // Format the derived date as YYYY-MM-DD
    const derivedDate = rhDate.getFullYear() + '-' + ('0' + (rhDate.getMonth() + 1)).slice(-2) + '-' + ('0' + rhDate.getDate()).slice(-2);
    return derivedDate;
}
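
A quick usage check with sample input values:

// Adding 10 days to 2024-12-24 (illustrative values)
addDays("2024-12-24", 10);   // returns "2025-01-03"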

Javascript code snap:


Screenshots:

Flow:


Configure the trigger to take two inputs and one output.





Add the custom-defined function and map the date and days parameters.

Test:




Monday, December 23, 2024

OIC - Managing Sequence Numbers using lookup in OIC for Dynamic Target File Naming

Use Case:

We have a requirement to create a unique file name by appending the following components:

  • A sequence number retrieved dynamically.
  • A hardcoded value '00000000'.
  • A Hong Kong time-converted timestamp in the format [Y0001][M01][D01][H01][m01][s01].

Solution:

Maintain Sequence Numbers Using a Lookup Table:

  • Create a lookup table (e.g., SCM_INVENTORY_TRANSACTION_SEQ_Lookup) with an initial sequence number.
  • Use the dvm:lookupValue function to retrieve the current sequence number dynamically during integration runtime.
Create a Reusable Integration to Update the Lookup:

  • Develop a reusable integration to increment the sequence number in the lookup by 1 after each successful integration execution.
  • This ensures the sequence number is unique and incremented only when the integration runs successfully.

Generate the Timestamp:

  • Use the ns99:format-dateTime function with the Hong Kong timezone to convert the current datetime into the required format: [Y0001][M01][D01][H01][m01][s01].

Concatenate the Components:

  • Combine the sequence number, hardcoded value '00000000', and the formatted timestamp to generate the unique file name.

Example file name format:

<FileNamePrefix_FirstPeriod>_<SequenceNumber>_<Timestamp>.dat


Solution codes:

Var_DatetimeStamp

ns99:format-dateTime(fn:adjust-dateTime-to-timezone(fn:current-dateTime(),ns0:dayTimeDuration("PT08H00M")), '[Y0001][M01][D01][H01][m01][s01]')

var_runno

fn:concat("00000000",dvm:lookupValue('SCM_INVENTORY_TRANSACTION_SEQ_Lookup', 'INT_ID', $var_interfaceID, 'SEQ', 'NA'))

Var_TargetFileName

concat(dvm:lookupValue('Common_SCM_Interface_Outbound_Lookup','Integration_Id', $var_interfaceID, 'FileNamePrefix_FirstPeriod', 'NA'),$var_runno,'_', $Var_DatetimeStamp,'.dat')
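
With illustrative values - a FileNamePrefix_FirstPeriod of 'SCMINVTXN_' (including the trailing underscore) stored in the lookup, a SEQ of '12', and a Hong Kong timestamp of 20241223143005 - the expressions above would produce a file name like:

SCMINVTXN_0000000012_20241223143005.dat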

Lookup API and Request payload details:

API: /ic/api/integration/v1/lookups/{name}

verb: PUT

request:

{

"columns": [ "rightnow", "soap" ],

  "rows": [

    {

      "rowData": [   "RN1",   "SOAP1"  ]

    }, {

      "rowData": [    "RN2",   "SOAP2"    ]

    }  ],

  "name": "myLookup"

}

Absolute Endpoint URI:

concat("https://design.integration.ap-region-1.ocp.oraclecloud.com/ic/api/integration/v1/lookups/",<LookpuName>,"?integrationInstance=",<ServiceInstanceNameFromintegrationMetadata>)

Trigger request and response json payloads:

Request:

"ProcessRequest" : {

    "Metadata" : {

        "ParentProcessId" : "123456",

        "Interface_Id" : "INT_XXX_XX_XXXX",

        "ParentProcessStartDateTime" : "2019-09-06T12:41:48.589+00:00",

    },

    "ProcessDetails" : {

        "LookUpName" : "VAL1",

        "SourceColumnName" : "VAL2",

        "TargetColumnName" : "VAL3",

        "SourceColumnValue" : "VAL4",

        "TargetColumnValue" : "VAL5"

    },

    "GenericParameters" : [

        {

            "ParameterName" : "KEY1",

            "ParameterValue" : "VAL2"

        },

        {

            "ParameterName" : "KEY2",

            "ParameterValue" : "VAL2"

        },

        {

            "ParameterName" : "KEY2",

            "ParameterValue" : "VAL2"

        }

    ]

}

Response:

"ProcessResponse" : {

    "Status" : "YES",

    "ProcessDetails" : {

        "LookUpName" : "VAL1",

        "SourceColumn" : "VAL2",

        "TargetColumn" : "VAL3",

        "SourceValue" : "VAL4",

        "TargetValue" : "VAL5"

    }

}

Reference:

https://docs.oracle.com/en/cloud/paas/integration-cloud/rest-api/op-ic-api-integration-v1-lookups-name-put.html

Screenshots for creating reusable lookup update integration:

Integration flow:

Configure trigger request - response.





Configure lookup update API


Map

Test:








OIC - Seamless Integration: SCM to SAP EFMS Inventory Transactions Flow

Use Case:

Here, we will see how to integrate between SCM and SAP applications for inventory transactions.

This flow consists of the following integrations:

  1. Scheduler: <InterfaceNumber>_OracleSCMToEFMS_Inventory_Transaction_SCH
    • This will call a custom ESS job by passing two parameters: dateFrom and dateTo.
    • OIC uses the last successful run timestamp as dateFrom and the current timestamp as dateTo for the data extraction.
    • It calls the exportBulkData process to trigger the BIP extract of the inventory transaction data.
  2. Common: SCM_Common_Interface_Outbound_Callback
    • A callback event is raised by ERP Cloud upon completion of the ESS job; this integration subscribes to that callback.
    • This service feeds the job details to the main process.
  3. Main: <InterfaceNumber>_OracleSCMToEFMS_Inventory_Transaction_Main
    • Decode the data from base64 and transform the inventory transaction data.
    • Transform the data into a fixed-length file.
    • Prepare the inventory transaction target file in EFMS format by adding a header and trailer.

Scheduler service:
  1. Schedule parameters:
    1. ToDate : format: YYYY-MM-DD HH:MM:SS
    2. DateFrom : format: YYYY-MM-DD HH:MM:SS
  2. Assign dateTo: format-dateTime(startTime,'[Y0001]-[M01]-[D01] [H01]:[m01]:[s01]')
  3. Invoke the ERP Cloud adapter and configure:
    1. Query, Create, Update, or Delete Information
    2. Services - ErpIntegrationService
    3. Operation: exportBulkData
  4. Add a switch and map the ERP exportBulkData request as below:
    1. If ToDate = '' (empty), map as:
      1. Job name: /oracle/apps/ess/custom/Procurement/InventoryTransaction,InventoryTransactionSyncToEfms
      2. Param list: Orgcode,#NULL,#NULL,dateFrom,dateTo
      3. Job options: EnableEvent=Y
    2. Else map as below:
      1. Job name: /oracle/apps/ess/custom/Procurement/InventoryTransaction,InventoryTransactionSyncToEfms
      2. Param list: Orgcode,#NULL,#NULL,dateFrom,ToDate
      3. Job options: EnableEvent=Y
  5. Assign params:
    1. dateFrom = $dateTo
    2. toDate = ''
  6. If any error occurs, throw a new fault with the key ESSJOBERROR, using a lookup where the error code and reason are stored; the details can be fetched from the scope fault object (see the sketch after this list).
  7. In the global fault handler, we pass the error details to the SaaS monitoring app (Datadog).
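
As referenced in step 6 above, a hedged sketch of resolving the fault reason from such an error lookup (the lookup and column names are illustrative):

dvm:lookupValue('SCM_Common_Error_Lookup', 'ErrorKey', 'ESSJOBERROR', 'ErrorReason', 'NA')
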
Common service:
  1. Subscribe to the event:
    1. Receive business events raised within ERP Cloud
    2. ERP Integration Outbound Event
    3. Add a filter like <xpathExpr xmlns:ns0="http://xmlns.oracle.com/apps/financials/commonModules/shared"> $inputVariable/ns0:JOBS[1]/ns0:JOBNAME='Inventory Transaction Sync To EFMS' or ... </xpathExpr>
  2. Assign the job name to a variable Var_JobName: /ns15:onEvent/ns9:ErpExportBulkDataEvent/ns9:JOBS[1]/ns9:JOBNAME
  3. Validation: throw a new fault ESSJOBSTATUSFAILED with skip condition: SummaryStatus='SUCCEEDED' (the fault is raised only when the ESS job did not succeed).
  4. Invoke the main integration and map the following info:
    1. SummaryStatus
    2. DocumentId
    3. Jobs details
    4. Connection properties >> relativeURI: get the relative URI based on the job name from the lookup.
  5. If any error occurs, in the scope fault handler throw a new fault LOCALINTEGRATIONERROR.
Main integration:
  1. Call the ERP Cloud adapter with the getDocumentForDocumentId operation (ErpIntegrationService) and map the DocumentId from the subscribed event.
  2. Write the content to stage as an opaque element using an opaque XSD schema.
  3. Unzip the file using the stage file action.
  4. For each unzipped ICS file:
    1. Add a switch and check if filetype = 'csv'.
    2. Read the CSV file using the stage file action.
    3. Write the target file to the FTP location. It has 3 sections:
      1. Header
      2. Data set
      3. Footer

Screenshots:

Subscribe event :

Download file:





Write content to stage as opaque:



