Tuesday, December 24, 2024

OIC - JavaScript - How to add days to a date in YYYY-MM-DD format

Use Case:

This function demonstrates how to add a specific number of days to a date provided in the YYYY-MM-DD format.

JavaScript code used:

function addDays(startDate, daysToAdd) {

    // Parse the YYYY-MM-DD string into a Date object (month is zero-based)
    const rhDate = new Date(startDate.slice(0, 4), startDate.slice(5, 7) - 1, startDate.slice(8, 10));

    // Add the requested number of days
    rhDate.setDate(rhDate.getDate() + Number(daysToAdd));

    // Format the derived date back to YYYY-MM-DD
    const derivedDate = rhDate.getFullYear() + '-' + ('0' + (rhDate.getMonth() + 1)).slice(-2) + '-' + ('0' + rhDate.getDate()).slice(-2);

    return derivedDate;

}
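
A quick usage check (assuming the function is registered as an OIC JavaScript library function, or run locally in Node.js):

// Example calls: the day addition rolls correctly over month and year boundaries
console.log(addDays('2024-12-24', 10)); // "2025-01-03"
console.log(addDays('2024-01-31', 1));  // "2024-02-01"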

JavaScript code snapshot:


Screenshots:

Flow:


Configure the trigger to take two inputs and one output.





Use the custom-defined function and map the date and days parameters.

Test:




Monday, December 23, 2024

OIC - Managing Sequence Numbers Using a Lookup for Dynamic Target File Naming

Use Case:

We have a requirement to create a unique file name by appending the following components:

  • A sequence number retrieved dynamically.
  • A hardcoded value '00000000'.
  • A Hong Kong time-converted timestamp in the format [Y0001][M01][D01][H01][m01][s01].

Solution:

Maintain Sequence Numbers Using a Lookup Table:

  • Create a lookup table (e.g., SCM_INVENTORY_TRANSACTION_SEQ_Lookup) with an initial sequence number.
  • Use the dvm:lookupValue function to retrieve the current sequence number dynamically during integration runtime.

Create a Reusable Integration to Update the Lookup:

  • Develop a reusable integration to increment the sequence number in the lookup by 1 after each successful integration execution.
  • This ensures the sequence number is unique and incremented only when the integration runs successfully.

Generate the Timestamp:

  • Use the ns99:format-dateTime function with the Hong Kong timezone to convert the current datetime into the required format: [Y0001][M01][D01][H01][m01][s01].

Concatenate the Components:

  • Combine the sequence number, hardcoded value '00000000', and the formatted timestamp to generate the unique file name.

Example file name format:

<FileNamePrefix_FirstPeriod>_<SequenceNumber>_<Timestamp>.dat


Solution codes:

Var_DatetimeStamp

ns99:format-dateTime(fn:adjust-dateTime-to-timezone(fn:current-dateTime(),ns0:dayTimeDuration("PT08H00M")), '[Y0001][M01][D01][H01][m01][s01]')

var_runno

fn:concat("00000000",dvm:lookupValue('SCM_INVENTORY_TRANSACTION_SEQ_Lookup', 'INT_ID', $var_interfaceID, 'SEQ', 'NA'))

Var_TargetFileName

concat(dvm:lookupValue('Common_SCM_Interface_Outbound_Lookup','Integration_Id', $var_interfaceID, 'FileNamePrefix_FirstPeriod', 'NA'),$var_runno,'_', $Var_DatetimeStamp,'.dat')
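
Purely for illustration, the same concatenation in JavaScript with hypothetical values standing in for the lookup results and the timestamp:

// Hypothetical inputs (real values come from the lookups and format-dateTime at runtime)
const fileNamePrefix = 'INV_TXN_';      // FileNamePrefix_FirstPeriod from the outbound lookup (assumed)
const varRunNo = '00000000' + '15';     // var_runno: hardcoded zeros + SEQ from the sequence lookup
const dateTimeStamp = '20241224153045'; // Var_DatetimeStamp in [Y0001][M01][D01][H01][m01][s01] form

// Mirrors the Var_TargetFileName concat() above
const targetFileName = fileNamePrefix + varRunNo + '_' + dateTimeStamp + '.dat';
console.log(targetFileName); // "INV_TXN_0000000015_20241224153045.dat"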

Lookup API and Request payload details:

API: /ic/api/integration/v1/lookups/{name}

Verb: PUT

request:

{
  "columns": [ "rightnow", "soap" ],
  "rows": [
    { "rowData": [ "RN1", "SOAP1" ] },
    { "rowData": [ "RN2", "SOAP2" ] }
  ],
  "name": "myLookup"
}

Absolute Endpoint URI:

concat("https://design.integration.ap-region-1.ocp.oraclecloud.com/ic/api/integration/v1/lookups/",<LookpuName>,"?integrationInstance=",<ServiceInstanceNameFromintegrationMetadata>)

Trigger request and response json payloads:

Request:

"ProcessRequest" : {

    "Metadata" : {

        "ParentProcessId" : "123456",

        "Interface_Id" : "INT_XXX_XX_XXXX",

        "ParentProcessStartDateTime" : "2019-09-06T12:41:48.589+00:00",

    },

    "ProcessDetails" : {

        "LookUpName" : "VAL1",

        "SourceColumnName" : "VAL2",

        "TargetColumnName" : "VAL3",

        "SourceColumnValue" : "VAL4",

        "TargetColumnValue" : "VAL5"

    },

    "GenericParameters" : [

        {

            "ParameterName" : "KEY1",

            "ParameterValue" : "VAL2"

        },

        {

            "ParameterName" : "KEY2",

            "ParameterValue" : "VAL2"

        },

        {

            "ParameterName" : "KEY2",

            "ParameterValue" : "VAL2"

        }

    ]

}

Response:

"ProcessResponse" : {

    "Status" : "YES",

    "ProcessDetails" : {

        "LookUpName" : "VAL1",

        "SourceColumn" : "VAL2",

        "TargetColumn" : "VAL3",

        "SourceValue" : "VAL4",

        "TargetValue" : "VAL5"

    }

}

Reference:

https://docs.oracle.com/en/cloud/paas/integration-cloud/rest-api/op-ic-api-integration-v1-lookups-name-put.html

Screenshots for creating reusable lookup update integration:

Integration flow:

Configure trigger request - response.





Configure lookup update API


Map

Test:








OIC - Seamless Integration: SCM to SAP EFMS Inventory Transactions Flow

Use Case:

Here, we will see how to integrate the SCM and SAP applications for inventory transactions.

This flow consists of the following integrations:

  1. Scheduler: <InterfaceNumber>_OracleSCMToEFMS_Inventory_Transaction_SCH
    • This will call a custom ESS job, passing two parameters: dateFrom and dateTo.
    • OIC uses the last successful run timestamp as dateFrom and the current timestamp as dateTo for the data extraction.
    • It will call the exportBulkData process to trigger the BIP report that extracts the inventory transaction data.
  2. Common: SCM_Common_Interface_Outbound_Callback
    • ERP Cloud raises a callback event upon completion of the ESS job; this integration subscribes to that event.
    • This service feeds the job details to the main process.
  3. Main: <InterfaceNumber>_OracleSCMToEFMS_Inventory_Transaction_Main
    • Decode the data from base64 and transform the inventory transaction data.
    • Transform the data into a fixed-length file.
    • Prepare the inventory transaction target file in EFMS format by adding a header and trailer.

Scheduler service:
  1. Schedule parameters:
    1. ToDate : format: YYYY-MM-DD HH:MM:SS
    2. DateFrom : format: YYYY-MM-DD HH:MM:SS
  2. Assign dateTo: format-dateTime(startTime,'[Y0001]-[M01]-[D01] [H01]:[m01]:[s01]') (see the JavaScript sketch after this list)
  3. Invoke the ERP Cloud adapter and configure:
    1. Query, Create, Update or Delete Information
    2. Service: ErpIntegrationService
    3. Operation: exportBulkData
  4. Take a switch and map the ERP export bulk data as below:
    1. If ToDate = '' (empty), map as:
      1. Job name: /oracle/apps/ess/custom/Procurement/InventoryTransaction,InventoryTransactionSyncToEfms
      2. Param list: Orgcode,#NULL,#NULL,dateFrom,dateTo
      3. Job options: EnableEvent=Y
    2. Else map as below:
      1. Job name: /oracle/apps/ess/custom/Procurement/InventoryTransaction,InventoryTransactionSyncToEfms
      2. Param list: Orgcode,#NULL,#NULL,dateFrom,ToDate
      3. Job options: EnableEvent=Y
  5. Assign params:
    1. dateFrom = $dateTo
    2. toDate = ''
  6. On any error, throw a new fault with the key ESSJOBERROR, using a lookup where the error code and reason are stored. Details can be fetched from the scope fault object.
  7. In the global fault handler, we pass the error details to the Datadog SaaS application.
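
If the same 'YYYY-MM-DD HH:MM:SS' stamp were ever needed inside an OIC JavaScript library instead of the XPath format-dateTime above, a rough equivalent might look like this (illustrative only):

// Formats a Date as 'YYYY-MM-DD HH:MM:SS', matching '[Y0001]-[M01]-[D01] [H01]:[m01]:[s01]'
function formatDateTime(d) {
    const pad = function (n) { return ('0' + n).slice(-2); };
    return d.getFullYear() + '-' + pad(d.getMonth() + 1) + '-' + pad(d.getDate()) +
           ' ' + pad(d.getHours()) + ':' + pad(d.getMinutes()) + ':' + pad(d.getSeconds());
}

console.log(formatDateTime(new Date())); // e.g. "2024-12-23 14:05:09"
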
Common service:
  1. Subscribe to the event:
    1. Receive business events raised within ERP Cloud
    2. ERP Integration Outbound Event
    3. Add a filter like <xpathExpr xmlns:ns0="http://xmlns.oracle.com/apps/financials/commonModules/shared"> $inputVariable/ns0:JOBS[1]/ns0:JOBNAME='Inventory Transaction Sync To EFMS' or ... </xpathExpr>
  2. Assign the job name to a variable Var_JobName: /ns15:onEvent/ns9:ErpExportBulkDataEvent/ns9:JOBS[1]/ns9:JOBNAME
  3. Validation: throw a new fault ESSJOBSTATUSFAILED, with skip condition: summarystatus='SUCCEEDED'
  4. Invoke the main integration and map the following info:
    1. SummaryStatus
    2. DocumentId
    3. Job details
    4. Connection properties >> relativeURI: get the relative URI based on the job name from a lookup.
  5. On any error, in the scope fault handler, throw a new fault LOCALINTEGRATIONERROR.
Main integration:
  1. Call the ERP Cloud adapter with the getDocumentForDocumentId operation (ErpIntegrationService) and map the DocumentId from the subscribed event.
  2. Write the content to stage as an opaque element using the opaque XSD schema.
  3. Unzip the file using a stage action.
  4. For each unzipped ICS file:
    1. Take a switch and check if the file type = 'csv'.
    2. Read the CSV file using a stage action.
    3. Write the target file to the FTP location. It has 3 sections (see the sketch after this list):
      1. Header
      2. Data set
      3. Footer
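
To make the header / data set / footer structure concrete, here is an illustrative JavaScript sketch of assembling a fixed-length file; the field names and widths are invented for the example, and the real layout comes from the EFMS file specification:

// Illustrative only: pad each field to a fixed width and assemble header, data and trailer lines
function padRight(value, width) {
    return String(value === undefined ? '' : value).padEnd(width, ' ').slice(0, width);
}

function buildEfmsFile(records, runNo, timestamp) {
    const header = 'H' + padRight(runNo, 10) + padRight(timestamp, 14);
    const dataLines = records.map(function (r) {
        return 'D' + padRight(r.itemCode, 20) + padRight(r.quantity, 10) + padRight(r.transactionType, 5);
    });
    const trailer = 'T' + padRight(records.length, 10); // record count in the footer
    return [header].concat(dataLines, [trailer]).join('\n');
}

console.log(buildEfmsFile(
    [{ itemCode: 'ITEM001', quantity: 5, transactionType: 'ISS' }],
    '0000000015', '20241224153045'
));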

Screenshots:

Subscribe event :

Download file:





Write content to stage as opaque:




Friday, December 20, 2024

Oracle Utilities Cloud - Batch job submission REST API details

Working ...

Usecase:

In this scenario, we will utilize the provided REST API to submit a batch job to the CCS utilities server and monitor its status. The batch job will facilitate the upload of a file from OCI object storage to the CCS server.

Submit Batch Job

Verb: POST
https://server:port/spl/rest/apis/common/batch/batchJobSubmission/

This operation adds a batch job submission record for the input batch control. It returns a generated Batch Job ID, which can be used to review the status of the job.

Request Response:
TBD

Get Batch Job / Batch Run Details
Verb: GET
https://server:port/spl/rest/apis/common/batch/batchJobSubmission/{batchJobId}

This operation is used to retrieve the details related to a given batch job ID. Details related to the batch job submission record are returned. If the batch job has an associated batch run, then all the available batch run details are also returned.

Request and Response:

TBD
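
Until the exact payloads are documented here, a rough JavaScript sketch of polling the status endpoint might look like this (placeholder host, port, job ID and credentials; basic authentication is assumed):

// Sketch only: poll the batch job status by ID
const batchJobId = '12345678901234'; // hypothetical ID returned by the submission call
const url = 'https://server:port/spl/rest/apis/common/batch/batchJobSubmission/' + batchJobId;

const res = await fetch(url, {
  headers: { 'Authorization': 'Basic ' + Buffer.from('user:password').toString('base64') }
});
console.log(res.status, await res.text()); // inspect the returned batch job / batch run details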

Reference:

OIC XSLT - Dynamic XPath Mapping in Oracle Integration: Handling Nested Transactions | use of current() function

Use Case Description:

Scenario:
In an integration process, we need to map and extract the ChequeNumber from a specific PaymentRecord dynamically based on the TransactionNumber of the stub account record. The mapping ensures that the correct record is retrieved by comparing the current TransactionNumber in context with the corresponding field in the source data.

Challenge:
Accurately identify and map the ChequeNumber when dealing with nested data structures, ensuring consistency and correctness in dynamic XPath expressions.

Solution:
Use the following XPath expression to achieve this:

$ReadSourceFile/nsmp0:ReadResponse/ns25:Payments/ns25:PaymentRecord[ns25:TransactionNumber=current()/ns25:TransactionNumber]/ns25:ChequeNumber




This approach ensures that the integration maps only the ChequeNumber for the record where the TransactionNumber matches the current context.
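
The same selection logic expressed as plain JavaScript, purely to build intuition for the predicate (the integration itself uses the XPath above):

// Conceptual equivalent of PaymentRecord[TransactionNumber = current()/TransactionNumber]/ChequeNumber
function chequeNumberFor(currentStub, paymentRecords) {
    const match = paymentRecords.find(function (record) {
        return record.TransactionNumber === currentStub.TransactionNumber;
    });
    return match ? match.ChequeNumber : undefined;
}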

Note:
Within a template, current() allows you to explicitly reference the node being processed without needing to specify its path. This can be helpful when working with recursive templates or when you want to access the current node while traversing XML documents.


OIC XSLT - Use of fn:format-number() | Formatting Large Amount Values to Two Decimal Places in XSLT

Use Case: Formatting Large Amount Values to Two Decimal Places in XSLT.

Scenario:

We are receiving an amount field value in the following format:

Input Data: 0000567892987

Our goal is to transform this input into a properly formatted number with two decimal places:

Target Output: 5678929.87

Solution code:

fn:format-number((number($ReadSourceFile/nsmp0:ReadResponse/ns25:Payments/ns25:FileTrailer/ns25:TotalStubAmount) div 100), "#.00")
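
The same transformation in JavaScript, shown only to make the arithmetic explicit: the input is an implied-decimal amount, so divide by 100 and format to two decimal places.

// '0000567892987' carries the cents in its last two digits
const raw = '0000567892987';
const amount = (Number(raw) / 100).toFixed(2);
console.log(amount); // "5678929.87"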



