Saturday, December 30, 2023

OIC ERP - How to download Import Payables Invoices Report using ERP cloud Adapter from Oracle Integration


Here, we will see how to download the Import Payables Invoices report. That is, we will download the report-equivalent .xml file, which contains the same details as the report.

Why do we need this:

Note: By default, when we import an AP standard invoice with callback enabled, downloading the ESS job execution details using the Import Payables Invoices process request id gives only a .log file, not the actual execution report.

High level steps:

  • Create an app-driven integration and configure it to subscribe to the callback event
  • If summaryStatus shows the Import Payables Invoices job succeeded, store the Import Payables Invoices request id in a variable
  • Call the Cloud ERP adapter's submitESSJobRequest operation of ErpIntegrationService to run the report, and get the request id in the response
  • Take a while loop and check getESSJobStatus
  • Once the submitted ESS job succeeds, call downloadESSJobExecutionDetails using the Cloud ERP adapter
  • Write the zipped file to a stage location using an opaque schema
  • Unzip the files using a stage action
  • Map the .xml file (putting a predicate condition fileType='xml' in the XPath) to send as a notification attachment
ESS report package details:
jobPackageName: "/oracle/apps/ess/financials/payables/invoices/transactions/"
jobDefinitionName: "APXIIMPT_BIP"

Detailed steps with screenshots:

Integration flow:

Subscribe callback event:

Assign global variables for the request id, ESS job package name, path, loop counter, ESS job status, etc.

Switch to check summaryStatus

Store import payables invoices process request id.

Configure the submit ESS job request and map the import request id and the ESS job package and definition names.

While loop to check ess job status

Configure getEssJobStatus and map the request id, which is the response of submitEssJobRequest.

Update the status and counter for the loop.

Put a wait time.
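The while-loop, status check, and wait steps above can be sketched as a small polling helper. The set of "still running" states here is an assumption (an approximation of the ESS job state names), and the status call is injected as a callable so the loop logic stands on its own:

```python
import time

# Assumed "still running" ESS states; terminal states include SUCCEEDED, ERROR, WARNING.
RUNNING_STATES = {"WAIT", "READY", "RUNNING", "BLOCKED", "PAUSED"}


def wait_for_ess_job(request_id, get_status, max_attempts=10, wait_seconds=30):
    """Poll get_status(request_id) until the job reaches a terminal state."""
    for _ in range(max_attempts):
        status = get_status(request_id)
        if status not in RUNNING_STATES:
            return status          # e.g. SUCCEEDED
        time.sleep(wait_seconds)   # the wait action between status checks
    raise TimeoutError(f"ESS job {request_id} still running after {max_attempts} checks")
```

In OIC, get_status corresponds to the getESSJobStatus invoke, and the counter/status variables play the role of the loop bound here.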

Once the ESS job succeeds, configure downloadEssJobExecutionDetails and map the request id from submitEssJobRequest.

Take a stage to write the zipped file.

Unzip the files.

Take a notification action, add the required details, and attach the .xml file in the attachment area. We need to put a predicate condition fileType='xml' in the XPath to read the .xml file.
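Outside OIC, the unzip-and-pick-the-xml step is equivalent to the following sketch: open the execution-details zip and keep only the .xml entries (the counterpart of the fileType='xml' predicate). File names are illustrative:

```python
import io
import zipfile


def extract_xml_report(zip_bytes: bytes) -> dict:
    """Return {name: content} for the .xml files inside the ESS execution-details zip."""
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        return {
            name: zf.read(name)
            for name in zf.namelist()
            if name.lower().endswith(".xml")  # fileType = 'xml'
        }
```

The returned .xml content is what gets mapped to the notification attachment; the .log entry is simply skipped.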

After test:

Monday, December 25, 2023

ODI 12C - How to clean Cache to start ODI studio

Sometimes when you try to start ODI Studio from the console or terminal, you will observe that it does not start. In that situation, the simple solution is to clean the cache before starting ODI Studio.

Run the following command to clean the cache:

./odi.sh -clean -initialize



After cleaning the cache, run the usual startup command and ODI Studio will start.


Saturday, December 9, 2023

OIC ERP - When to use Fusion Accounting Hub (FAH) vs GL Journal integration?

When to use Fusion Accounting Hub (FAH) vs GL Journal integration?

When to use GL Journal import:

For third-party source systems (such as Payroll or Treasury) that provide already-transformed accounting transactions or data, we can use the GL Journal Import interface. We can build a custom integration that imports a periodic data file from the source system, performs data validation, and populates the GL Journal FBDI template. We can then upload the template and import the GL Journal.

When to use FAH:

For source systems that are unable to provide transformed accounting data, we can use Fusion Accounting Hub (FAH): create a subledger source for the third-party system and configure the account transformation rules, then import the transaction data using the FBDI template.

Once transactions are accounted in the FAH subledger, there are two ways to transfer these accounted entries to Oracle General Ledger.

  • One is to transfer them all at once by submitting an individual Journal Import.
  • The other is to submit it as part of the Create Accounting process itself (Create Accounting => Accounting Program => Journal Import). With the second option, the system submits a Journal Import for each of the child accounting programs, i.e., concurrency is achieved at the Journal Import level. This can save up to 30% of total time and works within an interface.

Tuesday, December 5, 2023

OIC - ERP - Import AP Invoice to Cloud ERP using FBDI | Detailed steps to import AP Invoice to Cloud ERP

Usecase: Here, we will demonstrate the detailed implementation steps for AP Invoice FBDI import: polling the file from the source >> creating the FBDI file >> importing to ERP >> subscribing to the callback.

Source Invoice file structure:

Download the FBDI template: 

Oracle site: ERP >> Financials >> select 23D or latest >> Administer >> Import Data >> Get started with file-based data import >> Payables >> Payables Standard Invoice Import >> XLSM template section.

Implementation steps:

  1. Poll the invoice file from the source. We can poll from an SFTP server, call a REST service, etc. In this case, we call a child integration which gets the files using a REST call. (Put a switch and check: if there is any failure, throw a new fault with the exception code, reason, and details to the scope handler.)
  2. Iterate over the polled files using a loop action. (All steps below are under a body scope.)
    1. For each file, read the file reference using a stage action, decodeBase64ToReference(), and a sample CSV file.
      1. Iterate over each record
        1. Take an assign to keep line and amount counters for validation.
        2. If line counter > 3 then (in the source file, the first 3 records are header fields that we skip)
          1. Check if C1 = Invoice then
            1. Get a unique invoice id by calling a BI report. (How to create an OTBI report, and how to call a BI report from OIC: (
            2. Add the total invoice amount for each header: varInvoiceAmount = add(varInvoiceAmount, eachHeaderAmount). This is required for file validation. Use this custom function to add numbers with 2 decimal places.
            3. Write the header file using the FBDI invoice header template. File name: ApInvoicesInterface.csv (mandatory mapped fields: Invoice Id (unique id generated from the BI report), Business Unit, Source, Invoice Number, Invoice Amount, Invoice Date (YYYY/MM/DD format), Supplier Name, Supplier Number, Supplier Site, Invoice Currency, Description, Invoice Type, Legal Entity, Payment Terms, Pay Group, Payment Reason Code, Payment Reason Comments)
          2. Check if C1 = Invoice Line then
            1. Write the line FBDI file using the template. File name: ApInvoiceLinesInterface.csv. Mandatory mapped fields: Invoice Id (the same unique, sequential id fetched from BI), Line Number, Line Type (e.g., ITEM), Amount, Invoice Quantity, Unit Price, Description, Distribution Combination, Tax Classification Code.
          3. Check if C1 = Invoice Trailer then
            1. Take the file record count and total file amount, which are needed for validation.
    2. Create a zip file using a stage action for the invoice header and line files.
    3. Properties file: this file contains the import job package name, job definition, zip file prefix, and job parameters to be passed. Check my blog below for details on how to get the job details, parameters, etc. There are two options; choose one based on requirements.
      1. Generally we create this .properties file and zip it with the FBDI header and line files if we use the Import Bulk Data operation of the Oracle Cloud ERP adapter. (
      2. Map the comma-separated job parameters while using the CRUD ErpIntegrationService under the Oracle Cloud ERP adapter. In this case, we used the 2nd option:
        1. Mapped job parameters: "#NULL,BusinessUnitID,N,#NULL,#NULL,#NULL,1000,SourceName,#NULL,N,N,LedgerId,#NULL,1"
        2. Job name: /oracle/apps/ess/financials/payables/invoices/transactions/,APXIIMPT
    4. File validation:
      1. If the trailer record's total line count = the total line count and the trailer total amount = the total line amount, then it is a valid file:
        1. Encrypt and import: there are generally two different methods to import to ERP using the Oracle Cloud ERP adapter; choose one based on requirements.
          1. Using bulk import. (
          2. Using the CRUD service ErpIntegrationService >> importBulkData operation: here we create a reusable child integration to import. In this case we used the 2nd option. (
        2. Get the import job status using the Cloud ERP adapter and ErpIntegrationService. (
        3. Move the source file as a zip, appending the import or load ESS request id, to the processing folder for further processing during callback. (Business-specific requirement.)
      2. Otherwise,
        1. Write the source file to the archive folder and delete it from the polling folder. (
        2. Create logs, create an incident, and send an email notification. (
        3. Throw a new fault to the scope handler using the trailer fault as the code, the incident short description as the reason, and the log description as the details.
    5. Callback part: we can follow either of the below methods, default or custom callback. In this case we have used the custom callback.
      1. Default callback subscription:
        1. Prerequisite:
        2. Follow this blog to import, enable, and subscribe to the callback. (
        3. Download the Import Payables Invoices report as .xml and send it as a notification attachment -
      2. Custom report callback: follow this blog:
      3. ERP inbound integration callback subscription: the blog below shows how to subscribe to events that are raised within ERP. Will update more details later. (
    6. Error handling:
      1. From the body scope, after each invoke, put a switch condition and, if an exception exists, throw a new (customized) fault to the scope's default fault handler. For example, after invoking get files, import, get ESS job status, etc.
      2. From the scope fault handler:
        1. If the import has not failed (checking the error code), move the file from the source directory to the archive directory.
        2. Throw the fault to the global fault handler.
      3. From the global fault handler:
        1. Send a notification or email.
        2. Log the fault details to a DB or file/FTP location.
        3. Create an incident ticket.
        4. Rethrow the fault.
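The validation and packaging parts of the steps above can be condensed into a small sketch. The source record layout (amounts per line, trailer count and total) follows this post; the actual importBulkData call through the Oracle Cloud ERP adapter is left out, and only the two FBDI file names and the job details quoted above are taken from the post:

```python
import io
import zipfile
from decimal import Decimal


def validate_against_trailer(lines, trailer_count, trailer_amount):
    """File is valid when the trailer's line count and total match the data lines."""
    total = sum(Decimal(l["amount"]) for l in lines)
    return len(lines) == trailer_count and total == Decimal(trailer_amount)


def zip_fbdi(header_csv: str, lines_csv: str) -> bytes:
    """Zip the FBDI header and line files with the names the import job expects."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        zf.writestr("ApInvoicesInterface.csv", header_csv)
        zf.writestr("ApInvoiceLinesInterface.csv", lines_csv)
    return buf.getvalue()


# Job details used with importBulkData in this post (option 2, comma-separated params):
JOB_NAME = "/oracle/apps/ess/financials/payables/invoices/transactions/,APXIIMPT"
JOB_PARAMS = "#NULL,BusinessUnitID,N,#NULL,#NULL,#NULL,1000,SourceName,#NULL,N,N,LedgerId,#NULL,1"
```

Using Decimal (rather than float) mirrors the 2-decimal-place custom function mentioned above, so amount totals compare exactly.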
To know how to upload a PGP key for encryption and decryption, follow my blog below:

Monday, December 4, 2023

OIC - Create a reusable integration to write or move and delete files from AWS S3 using REST

Usecase: Here, we will create a reusable integration, say OIC_FILE_Handler, which will move and delete files from a specified directory of an AWS S3 bucket using the REST feature.

It will perform the following operations:

  1. Write file
  2. Delete file
  3. Move file (combines options 1 and 2)
REST trigger Request JSON:

REST Response JSON:


Implementation logic steps:
  1. Create an AWS S3 REST connection
  2. Create an app-driven orchestration pattern and configure REST to accept the base64-encoded file, source path, delete path, source and delete directories, and different flags.
  3. Assign globals:
    1. varBasePathAndFile : concat(deleteDirectory, sourceFileName)
    2. varMovePathAndFile : concat(targetDirectory, targetFileName)
    3. varFileref: ""
    4. varMoveComplete : N
    5. varDeleteComplete : N
    6. varFileStatus : "error"
    7. varCounter : 0.0
  4. IF base64FileRef != empty, then decode base64 to file reference using decodebase64ToReference().
  5. If moveFileFlag = Y, call the S3 REST service to place the file in the specified directory and map the decoded file ref to the stream reference.
    1. Resource path: /{pathAndFileName}
    2. VERB: put
    3. Request payload as Binary - application/zip
    4. Also, put retry logic in place to try placing the file to S3 up to 3 times
      1. Take a scope
      2. While condition: varFileStatus = "error" and varCounter < 3.0
      3. If success, assign varFileStatus = "success"
      4. Under scope > default fault handler > increment the counter by 1
    5. Update varMoveComplete = Y if it succeeds
  6. If deleteSourceFileFlag = Y, call the S3 REST service and configure below:
    1. Resource path: /{pathAndFileName}
    2. Verb: Delete
    3. Map the varBasePathAndFile
    4. If it succeeds, assign varDeleteComplete = Y
  7. Update the response of the rest - fileMoved, fileDeleted and jobStatus.
  8. Update the exception scope fault status from the main scope fault handler
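The retry logic in step 5.4 can be sketched as a small helper that tracks the same status/counter pair as the integration (varFileStatus, varCounter). The actual PUT or DELETE is injected as a callable, so nothing here talks to AWS; names are illustrative:

```python
def call_with_retry(operation, max_attempts=3):
    """Run operation() until it succeeds or max_attempts failures occur."""
    status, counter = "error", 0          # varFileStatus / varCounter
    while status == "error" and counter < max_attempts:
        try:
            operation()
            status = "success"            # success branch inside the scope
        except Exception:
            counter += 1                  # scope fault handler increments the counter
    return status

# e.g. call_with_retry(lambda: s3_put(varMovePathAndFile, file_ref))  # s3_put is hypothetical
```

The returned status maps directly onto the varMoveComplete / varDeleteComplete updates in steps 5.5 and 6.4.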


Monday, November 27, 2023

OIC - Get files from AWS S3 using Rest service

Usecase: Here, we will create a reusable integration with which we can fetch CSV files (plain CSV files and zip file contents) from AWS S3 based on a file directory and file pattern using the AWS S3 REST service.

REST response JSON format (the response carries the file references and names, plus an "OutputCreatedFlag" field):


Brief steps:

  1. Get the list of files from AWS S3 via REST
  2. For each file, check if it is a CSV or a zip
  3. If CSV, download the file ref, store it in a stage file, and later map it to the response as base64-encoded content.
  4. If zip, download the zip file ref and
    1. If unzipFlag = N, write the ref to a stage file and later map it to the response as base64-encoded content.
    2. Otherwise, unzip it; for each file, write it to a stage file as a ref and later map all file refs to the response as base64-encoded content.

Here, we also put a retry logic to get list of files from s3.

Implementation Logic steps:

  1. Create AWS S3 REST trigger and invoke connections
    1. Provide the REST API base URL like
    2. Security as AWS Signature Version 4
    3. Provide the access key, secret key, AWS region, and service name
  2. Create an app driven integration and configure the trigger as below
    1. Pass Directory name as template param
    2. Pass unzipFlag and filePattern as Query Param
    3. Provide a Json response to send file references and file names.
  3. Take a bodyscope and assign all the Global variables required.
    1. varDirectory : replaceString(directory,"_","/")
    2. varFileName : fileRef.csv
    3. varWriteOut : /writeOut
    4. varObjectCount : 0
    5. varUnzipDirectory : concat("/unzip",$varObjectCount)
    6. varFilePatternS3 : concat("Contents[?contains(Key,'",filePattern,"')]")
    7. varOutputCreated : N
    8. varS3FileStatus : error
    9. varS3Counter : 0
  4. Get list of files from the S3 directory:
    1. Take a while action for the retry logic and loop up to 3 times: $varS3FileStatus = "error" and $varS3Counter < 3.0
      1. Take a scope
      2. Configure rest endpoint to get the list of files
        1. Verb: get
        2. Resource uri: /
        3. Add 4 query params
          1. list-type
          2. prefix
          3. delimiter
          4. query
        4. Provide ListBucketResult XML as payload response.
      3. Map the directory and file pattern
      4. Assign success status for the loop for success case
      5. In the fault handler, increment the counter.
      6. Use a data stitch to store the list-of-files response in a global variable.
  5. Iterate over each bucket contents or each list file
    1. Update object count
      1. varObjectCount : $varObjectCount +1
      2. varUnzipDirectory : concat("/unzip",$varObjectCount)
      3. varParentFileName: substring-after(Key,$varDirectory)
    2. If CSV file (contains($varParentFileName,".csv") = "true")
      1. Call s3 rest endpoint and configure below:
        1. Resource uri: /{fileName}
        2. Verb: Get
        3. Response as Binary Application/octet-stream
      2. Map the Contents Key to fileName.
      3. Write the file stream references to stage file as contents.
      4. Update varOutputCreated = Y
    3. If ZIP file (contains($varParentFileName,".zip") = "true")
      1. Call s3 rest endpoint and configure below:
        1. Resource uri: /{fileName}
        2. Verb: Get
        3. Response as Binary Application/zip
      2. Map the Contents Key to fileName.
      3. If unzipFlag = N
        1. Write the zip file ref to a stage CSV file as contents
        2. Update varOutputCreated as Y
      4. Otherwise
        1. Unzip the zip file using stage
        2. For each unzip file
          1. Write file reference to stage file as contents
          2. Update varOutputCreated as Y
  6. If $varOutputCreated = Y
    1. Read the stage file reference and map the all csv file base64 contents to response
  7. Otherwise
    1. No file; just update varOutputCreated as N
  8. Add scope fault to response exception code, reason and details under default fault handler
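The listing-and-branching part of the flow above can be sketched outside OIC as follows: parse the ListBucketResult XML returned by the S3 list call, apply the file-pattern filter (the counterpart of the Contents[?contains(Key, pattern)] query), and classify each key as CSV or zip. The sample keys are illustrative:

```python
import xml.etree.ElementTree as ET

# Default namespace used by S3's ListBucketResult responses.
NS = {"s3": "http://s3.amazonaws.com/doc/2006-03-01/"}


def list_matching_keys(list_bucket_xml: str, pattern: str):
    """Return the object keys whose name contains the file pattern."""
    root = ET.fromstring(list_bucket_xml)
    return [
        key.text
        for key in root.findall(".//s3:Contents/s3:Key", NS)
        if pattern in key.text
    ]


def classify(key: str) -> str:
    """Decide the download branch for a key, mirroring steps 5.2 and 5.3."""
    if key.endswith(".csv"):
        return "csv"   # download and stage as-is
    if key.endswith(".zip"):
        return "zip"   # download, then unzip if unzipFlag = Y
    return "skip"
```

In the integration, the same filtering is done by the query parameter on the list call and the two contains() switches.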

Implementation steps (with screenshots):

Create AWS S3 rest connection:

Rest trigger configure:

Assign Globals:

Get list of files from S3:
