Monday, November 27, 2023

OIC - Get files from AWS S3 using Rest service

Usecase: Here, we will create a reusable integration that fetches CSV files (standalone CSV files and the contents of any zip files) from AWS S3, based on a file directory and a file pattern, using the AWS S3 REST API.

REST response JSON format:

{
  "fileReferences": [
    {
      "base64FileReference": "base64data",
      "fileName": "test.csv",
      "parentFileName": "",
      "base64ParentFileReference": ""
    }
  ],
  "ExceptionCode": "",
  "ExceptionReason": "",
  "ExceptionDetails": "",
  "OutputCreatedFlag": ""
}

Brief steps:

  1. Get the list of files from AWS S3 via REST.
  2. For each file, check whether it is a CSV or a ZIP.
  3. If CSV, download the file reference, write it to a stage file and later map it to the response as base64-encoded content.
  4. If ZIP, download the zip file reference and
    1. If unzipFlag = N, write the reference to a stage file and later map it to the response as base64-encoded content.
    2. Otherwise, unzip it and, for each extracted file, write the reference to a stage file and later map all the file references to the response as base64-encoded content.

Here, we also add retry logic around the call that gets the list of files from S3.
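
Note: in OIC this flow is built with stage file actions and data mappings. Purely as a rough illustration of the same logic outside OIC, here is a minimal Python sketch using boto3; the bucket, prefix, pattern and flag values are placeholders, and the dictionary keys mirror the JSON response format shown above.

import base64
import io
import zipfile

import boto3

s3 = boto3.client("s3")
BUCKET = "my-bucket"            # placeholder bucket name
PREFIX = "inbound/invoices/"    # stands in for the directory template parameter
PATTERN = "invoice"             # stands in for the filePattern query parameter
UNZIP_FLAG = "Y"                # stands in for the unzipFlag query parameter

file_references = []

# Step 1: list the files under the directory and filter on the pattern
listing = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX)
for obj in listing.get("Contents", []):
    key = obj["Key"]
    if PATTERN not in key:
        continue
    body = s3.get_object(Bucket=BUCKET, Key=key)["Body"].read()

    # Steps 2-3: a CSV goes back as-is, base64 encoded
    if key.endswith(".csv"):
        file_references.append({
            "base64FileReference": base64.b64encode(body).decode(),
            "fileName": key.split("/")[-1],
            "parentFileName": "",
            "base64ParentFileReference": "",
        })

    # Step 4: a ZIP is either returned as-is or unzipped entry by entry
    elif key.endswith(".zip"):
        if UNZIP_FLAG == "N":
            file_references.append({
                "base64FileReference": base64.b64encode(body).decode(),
                "fileName": key.split("/")[-1],
                "parentFileName": "",
                "base64ParentFileReference": "",
            })
        else:
            with zipfile.ZipFile(io.BytesIO(body)) as zf:
                for name in zf.namelist():
                    file_references.append({
                        "base64FileReference": base64.b64encode(zf.read(name)).decode(),
                        "fileName": name,
                        "parentFileName": key.split("/")[-1],
                        "base64ParentFileReference": "",
                    })

response = {
    "fileReferences": file_references,
    "OutputCreatedFlag": "Y" if file_references else "N",
}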

Implementation Logic steps:

  1. Create an AWS S3 REST connection (trigger and invoke roles)
    1. Provide the REST API base URL, for example https://host.s3.us-west-2.amazonaws.com
    2. Security policy: AWS Signature Version 4
    3. Provide the access key, secret key, AWS region and service name
  2. Create an app driven integration and configure the trigger as below
    1. Pass the directory name as a template parameter
    2. Pass unzipFlag and filePattern as query parameters
    3. Provide a JSON response to return the file references and file names.
  3. Add a body scope and assign all the required global variables.
    1. varDirectory : replaceString(directory,"_","/")
    2. varFileName : fileRef.csv
    3. varWriteOut : /writeOut
    4. varObjectCount : 0
    5. varUnzipDirectory : concat("/unzip",$varObjectCount)
    6. varFilePatternS3 : concat("Contents[?contains(Key,'",filePattern,"')]"), which builds the filter Contents[?contains(Key,'<filePattern>')]
    7. varOutputCreated : N
    8. varS3FileStatus : error
    9. varS3Counter : 0
  4. Get list of files from the S3 directory:
    1. Take a while action for the retry logic and loop up to 3 times with the condition $varS3FileStatus = "error" and $varS3Counter < 3.0 (an equivalent retry loop is sketched in Python after this list)
      1. Take a scope
      2. Configure rest endpoint to get the list of files
        1. Verb: get
        2. Resource uri: /
        3. Add 4 query params
          1. list-type
          2. prefix
          3. delimiter
          4. query
        4. Provide ListBucketResult XML as payload response.
      3. Map the directory and file pattern
      4. Assign the success status for the loop in the success case.
      5. In the fault handler, increment the retry counter.
      6. Add a global variable and use data stitch to store the list-of-files response.
  5. Iterate over the bucket contents (each listed file)
    1. Update object count
      1. varObjectCount : $varObjectCount +1
      2. varUnzipDirectory : concat("/unzip",$varObjectCount)
      3. varParentFileName: substring-after(Key,$varDirectory)
    2. If CSV file (contains($varParentFileName,".csv") = "true")
      1. Call s3 rest endpoint and configure below:
        1. Resource uri: /{fileName}
        2. Verb: Get
        3. Response as Binary Application/octet-stream
      2. Map the Contents Key to fileName.
      3. Write the file stream reference to a stage file as contents.
      4. Update varOutputCreated = Y
    3. If ZIP file (contains($varParentFileName,".zip") = "true")
      1. Call s3 rest endpoint and configure below:
        1. Resource uri: /{fileName}
        2. Verb: Get
        3. Response as Binary Application/zip
      2. Map the Contents Key to fileName.
      3. If unzipFlag = N
        1. Write the zip file reference to a stage CSV file as contents
        2. Update varOutputCreated as Y
      4. Otherwise
        1. Unzip the zip file using a stage file action
        2. For each unzipped file
          1. Write the file reference to a stage file as contents
          2. Update varOutputCreated as Y
  6. If $varOutputCreated = Y
    1. Read the stage file reference and map all the CSV file base64 contents to the response
  7. Otherwise
    1. No file found; just update varOutputCreated as N
  8. Add the scope fault to the response exception code, reason and details under the default fault handler
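
Note: steps 3 and 4 above are OIC assign, while and REST invoke actions. Purely as a rough illustration of what the retry condition and the varFilePatternS3 expression evaluate to, here is a minimal Python sketch; the bucket, directory and pattern values are placeholders, and the jmespath library is used only to apply the same Contents[?contains(Key,'...')] filter that the concat expression builds.

import boto3
import jmespath

s3 = boto3.client("s3")
BUCKET = "my-bucket"                                     # placeholder
var_directory = "inbound_invoices".replace("_", "/")     # mirrors replaceString(directory,"_","/")
file_pattern = "invoice"                                 # filePattern query parameter
var_file_pattern_s3 = "Contents[?contains(Key,'" + file_pattern + "')]"   # mirrors varFilePatternS3

var_s3_file_status = "error"
var_s3_counter = 0
matched = []

# while $varS3FileStatus = "error" and $varS3Counter < 3.0
while var_s3_file_status == "error" and var_s3_counter < 3:
    try:
        listing = s3.list_objects_v2(Bucket=BUCKET, Prefix=var_directory)
        matched = jmespath.search(var_file_pattern_s3, listing) or []
        var_s3_file_status = "success"   # assign the success status in the success case
    except Exception:
        pass                             # keep the status as "error" so the loop retries
    var_s3_counter += 1                  # increment the retry counter

# Step 5 then iterates over each matched object
for obj in matched:
    print(obj["Key"])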


Implementation steps (with screenshots):

Create AWS S3 rest connection:

Rest trigger configure:

Assign Globals:

Get list of files from S3:

Tuesday, November 21, 2023

OIC - Update OIC lookups based on BI reports invoke and config file

Usecase: Here, we will see the logic steps and implementation for calling BI reports based on a lookups config file stored in an S3 bucket, and for updating OIC lookups such as Supplier Sites, Source Legal Entities, GL Source, BU Ledgers, etc.

The config file is a CSV file where each line contains the BI report path used to fetch the lookup values.
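
For illustration only (these report paths and lookup names are hypothetical, and the exact column layout is whatever your implementation expects), the config file could look like:

/Custom/Financials/Lookups/SupplierSitesLookupReport.xdo,SUPPLIER_SITES_LOOKUP
/Custom/Financials/Lookups/SourceLegalEntitiesReport.xdo,SOURCE_LEGAL_ENTITIES_LOOKUP
/Custom/Financials/Lookups/BULedgersReport.xdo,BU_LEDGERS_LOOKUP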

Logic steps:

From the main integration, named OIC_Lookups_Update_D:

  1. Take a body scope and assign globals
  2. Call the child integration to get the config file from the AWS S3 bucket. (https://soalicious.blogspot.com/2023/11/oic-get-files-from-aws-s3-rest-service.html)
  3. Loop over the config files
    1. Decode the base64 content to a file reference
    2. Read the file reference
    3. Loop over each lookup config record (a small Python sketch of steps 3.1 to 3.3 follows this list)
      1. Call the BI report using the SOAP adapter and pass the absolute report path name. (https://soalicious.blogspot.com/2021/05/oic-invoke-erp-bi-report.html)
      2. Call the child integration to store the BI lookup response to the S3 bucket (https://soalicious.blogspot.com/2023/12/oic-create-resuable-integration-to.html)
  4. Call the child integration and get the files (https://soalicious.blogspot.com/2023/11/oic-get-files-from-aws-s3-rest-service.html)
  5. Loop over the files
    1. For each file, call the child integration to update the OIC lookup. (https://soalicious.blogspot.com/2023/11/oic-import-replace-lookup-take-base64.html)
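
Note: steps 3.1 to 3.3 are OIC stage file actions. As a rough illustration outside OIC, decoding the base64 file reference returned by the child integration and looping over the config records could look like this in Python (the sample record and two-column layout are hypothetical):

import base64
import csv
import io

# Stand-in for the base64FileReference returned by the S3 "get files" child integration
base64_file_reference = base64.b64encode(
    b"/Custom/Financials/Lookups/SupplierSitesLookupReport.xdo,SUPPLIER_SITES_LOOKUP\n"
).decode()

# 3.1 decode the base64 content, 3.2 read it as CSV
config_text = base64.b64decode(base64_file_reference).decode("utf-8")

# 3.3 loop over each lookup config record
for record in csv.reader(io.StringIO(config_text)):
    report_path, lookup_name = record[0], record[1]
    # 3.3.1 call the BI report via the SOAP adapter, passing report_path
    # 3.3.2 call the child integration to store the BI response in the S3 bucket
    print(report_path, lookup_name)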




Monday, November 20, 2023

ERP BI - Case Study 1 - Create EText EFT template based BI report | Create etext fixed position based BI Report

Usecase: We will create an EFT template based or fixed position based BI report.

EFT template or layout:

https://docs.google.com/document/d/1HzqeSCsFBc8BO9m3bmGsaBXJqPul-a9n/edit?usp=drivesdk&ouid=105651791254983245041&rtpof=true&sd=true

Implementation Steps:

Step 1: Create a data model based on a SQL query:

Navigation Menu >> Tools >> Reports and Analytics >> Browse Catalog >> Create >> Data Model >> Click New Data Set >> SQL Query >> provide the SQL query and data set name >> OK >> go to the Data tab and view the data >> save as sample data >> export the XML data to see the structure and content >> save the data model.

SQL query used:

SELECT INVOICE_ID, INVOICE_NUM, INVOICE_AMOUNT, TO_CHAR(INVOICE_DATE,'DD-MON-RRRR') INVOICE_DATE
FROM AP_INVOICES_ALL
WHERE ROWNUM < 101
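
When you export the XML data from the data model, the element names depend on the data set name you provided; a typical exported shape for the query above (values are made up for illustration) looks like:

<DATA_DS>
  <G_1>
    <INVOICE_ID>300000123456789</INVOICE_ID>
    <INVOICE_NUM>INV-1001</INVOICE_NUM>
    <INVOICE_AMOUNT>2500</INVOICE_AMOUNT>
    <INVOICE_DATE>01-JAN-2023</INVOICE_DATE>
  </G_1>
</DATA_DS>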






Step 2: Based on the exported XML data structure, create the EFT delimiter based RTF doc and upload it to create a report >> Save >> View report

Note: Use the EFT template Google Drive link attached above, copy it and modify it as per your need. Save it as a .rtf file.












