Tuesday, June 18, 2024

OIC JavaScript Function - Gen 3 - The function definition contains unsupported syntax. The output parameter must be named.

While creating a JavaScript function in OIC Gen 3, we received the below error:

The function definition contains unsupported syntax. The output parameter must be named.



The JavaScript code used:

function replaceString(value, str1, str2) {
    output = value.replace(new RegExp(str1, "g"), str2);
    return output;
}

Solution:

If we declare a variable such as output here, we must also add the var keyword before it, because OIC Gen 3 does not accept assignments to undeclared variables in library functions.

Modified JavaScript code:

function replaceString(value, str1, str2) {
    var output = value.replace(new RegExp(str1, "g"), str2);
    return output;
}
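For reference, the function performs a global (all occurrences) replace of str1, treated as a regular-expression pattern, within value. A quick example of the expected behavior:

// replaces every "-" with "/"
replaceString("2024-06-18", "-", "/");   // returns "2024/06/18"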


Tuesday, June 11, 2024

OIC ERP - FBDI ImportBulkData using erpintegrations REST API | Get import status using erpintegrations REST API

Use case: Here, we will see how to call the ERP erpintegrations REST API to import FBDI data and then check the import status.

In this example, we will import supplier data.

References:

Refer to the below Oracle pages for REST API details:

https://docs.oracle.com/en/cloud/saas/financials/24b/farfa/op-erpintegrations-post.html

https://docs.oracle.com/en/cloud/saas/financials/24b/farfa/op-erpintegrations-operationname-get.html

ImportBulkData Operation:

Sample Request body:

{

   "OperationName":"importBulkData",

   "DocumentContent":"UEsDBBQAAAAIAHSsJUWkOHOwEwEAAHgDAAAXAAAAQXBJbnZvaWNlc0ludGVyZmFjZS5jc3bV0LFOwzAQBuAdiXewOjBdU5/ts+MxNICCIEKEIjEG4SEijUrsCMHT47ZEqpCyIBj6Tzf8urM/JETOORSXV+x88E3nvGerrgmwvMmK2wrGgraWJwIER7VAseAGgvPBdcH1m77xjj3X3Ss7Y6EffAAtSBmEVFhjJKyqHN4AqoeszLP7HID0HEmmljRk5RMU67V7aergIGa8IHE/x1NxPhwngsBjBBdG8jlCAn+XizI/PdlJSDtNNRZQUnzI90f4SLWph5aFiOR637qPSJRK5JBKo8SeqD4kknoutFWpmiCKa39LhAqJ4vp/IlJymmgskP5JNNsaPTauZXcJW9b+3bUtsOs+mW2ljKQolRKpndTn0Ut9AVBLAwQUAAAACAAze2pIXdvnRHQAAACpAAAAEQAAAEFQVEVTVC5QUk9QRVJUSUVTdYrBCoMwEER/Reg1sPQTQgklYFVoatvjNqSwEDYhK4J/b9CrXmYeMy8V9DEA5iwQROBPjOwJo0DGBX8xCBDPiXyFqSAL+okSi9LDx9rH4Co483Tq0r3aVo0k9W36HApu3r6fpu3G3t5Mc9fOvPX3wLjuvQJQSwECFAAUAAAACAB0rCVFpDhzsBMBAAB4AwAAFwAAAAAAAAAAACAAAAAAAAAAQXBJbnZvaWNlc0ludGVyZmFjZS5jc3ZQSwECFAAUAAAACAAze2pIXdvnRHQAAACpAAAAEQAAAAAAAAAAACAAAABIAQAAQVBURVNULlBST1BFUlRJRVNQSwUGAAAAAAIAAgCEAAAA6wEAAAAA",

   "ContentType":"zip",

   "FileName":"APTEST_0310.zip",

   "DocumentAccount":"fin$/payables$/import$",

   "JobName":"oracle/apps/ess/financials/payables/invoices/transactions,APXIIMPT",

   "ParameterList":"#NULL,Vision Operations,#NULL,#NULL,#NULL,#NULL,#NULL,INVOICE GATEWAY,#NULL,#NULL,#NULL,1,#NULL",

   "CallbackURL":"#NULL",

   "NotificationCode":"10",

   "JobOptions":"InterfaceDetails=1,ImportOption=Y ,PurgeOption = Y,ExtractFileType=ALL"

}

Sample Response body:

{

   "OperationName":"importBulkData",

   "DocumentId":null,

   "DocumentContent":"UEsDBBQAAAAIAHSsJUWkOHOwEwEAAHgDAAAXAAAAQXBJbnZvaWNlc0ludGVyZmFjZS5jc3bV0LFOwzAQBuAdiXewOjBdU5/ts+MxNICCIEKEIjEG4SEijUrsCMHT47ZEqpCyIBj6Tzf8urM/JETOORSXV+x88E3nvGerrgmwvMmK2wrGgraWJwIER7VAseAGgvPBdcH1m77xjj3X3Ss7Y6EffAAtSBmEVFhjJKyqHN4AqoeszLP7HID0HEmmljRk5RMU67V7aergIGa8IHE/x1NxPhwngsBjBBdG8jlCAn+XizI/PdlJSDtNNRZQUnzI90f4SLWph5aFiOR637qPSJRK5JBKo8SeqD4kknoutFWpmiCKa39LhAqJ4vp/IlJymmgskP5JNNsaPTauZXcJW9b+3bUtsOs+mW2ljKQolRKpndTn0Ut9AVBLAwQUAAAACAAze2pIXdvnRHQAAACpAAAAEQAAAEFQVEVTVC5QUk9QRVJUSUVTdYrBCoMwEER/Reg1sPQTQgklYFVoatvjNqSwEDYhK4J/b9CrXmYeMy8V9DEA5iwQROBPjOwJo0DGBX8xCBDPiXyFqSAL+okSi9LDx9rH4Co483Tq0r3aVo0k9W36HApu3r6fpu3G3t5Mc9fOvPX3wLjuvQJQSwECFAAUAAAACAB0rCVFpDhzsBMBAAB4AwAAFwAAAAAAAAAAACAAAAAAAAAAQXBJbnZvaWNlc0ludGVyZmFjZS5jc3ZQSwECFAAUAAAACAAze2pIXdvnRHQAAACpAAAAEQAAAAAAAAAAACAAAABIAQAAQVBURVNULlBST1BFUlRJRVNQSwUGAAAAAAIAAgCEAAAA6wEAAAAA",

   "FileName":"APTEST_0310.zip",

   "ContentType":"zip",

   "FileType":null,

   "DocumentAccount":"fin$/payables$/import$",

   "Comments":null,

   "ProcessName":null,

   "LoadRequestId":null,

   "JobPackageName":null,

   "JobDefName":null,

   "ReqstId":"14258",

   "RequestStatus":null,

   "JobName":"oracle/apps/ess/financials/payables/invoices/transactions,APXIIMPT",

   "ParameterList":"#NULL,Vision Operations,#NULL,#NULL,#NULL,#NULL,#NULL,INVOICE GATEWAY,#NULL,#NULL,#NULL,1,#NULL",

   "NotificationCode":"10",

   "CallbackURL":"#NULL",

   "JobOptions":"InterfaceDetails=1,ImportOption=Y ,PurgeOption = Y,ExtractFileType=ALL",

   "StatusCode":null,

   "ESSParameters":null,

   "links":[

      {

         "rel":"self",

         "href":"https://servername.fa.us2.oraclecloud.com/fscmRestApi/resources/11.13.18.05/erpintegrations/importBulkData",

         "name":"erpintegrations",

         "kind":"item"

      },

      {

         "rel":"canonical",

         "href":"https://servername.fa.us2.oraclecloud.com/fscmRestApi/resources/11.13.18.05/erpintegrations/importBulkData",

         "name":"erpintegrations",

         "kind":"item"

      }

   ]

}
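Outside OIC, the same importBulkData call can be exercised directly for testing. Below is a minimal Node.js sketch (assuming Node.js 18+ for the built-in fetch); the hostname, username/password, and local zip path are placeholders, and the payload simply mirrors the sample request above:

// Standalone sketch (not part of the OIC flow): call importBulkData directly.
const fs = require("fs");

async function importBulkData() {
  const body = {
    OperationName: "importBulkData",
    // Base64 content of the zipped FBDI file
    DocumentContent: fs.readFileSync("APTEST_0310.zip").toString("base64"),
    ContentType: "zip",
    FileName: "APTEST_0310.zip",
    DocumentAccount: "fin$/payables$/import$",
    JobName: "oracle/apps/ess/financials/payables/invoices/transactions,APXIIMPT",
    ParameterList: "#NULL,Vision Operations,#NULL,#NULL,#NULL,#NULL,#NULL,INVOICE GATEWAY,#NULL,#NULL,#NULL,1,#NULL",
    CallbackURL: "#NULL",
    NotificationCode: "10",
    JobOptions: "InterfaceDetails=1,ImportOption=Y,PurgeOption=Y,ExtractFileType=ALL"
  };

  const response = await fetch(
    "https://servername.fa.us2.oraclecloud.com/fscmRestApi/resources/11.13.18.05/erpintegrations",
    {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: "Basic " + Buffer.from("username:password").toString("base64")
      },
      body: JSON.stringify(body)
    }
  );

  const result = await response.json();
  console.log("ESS request id:", result.ReqstId); // used later by getESSJobStatus
  return result.ReqstId;
}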


GetESSJobStatus Operation:

Sample Response body:

{ "items": [ { "OperationName": "getESSJobStatus", "DocumentId": null, "DocumentContent": null, "FileName": null, "ContentType": null, "FileType": null, "DocumentAccount": null, "Comments": null, "ProcessName": null, "LoadRequestId": null, "JobPackageName": null, "JobDefName": null, "ReqstId": "14557", "RequestStatus": "SUCCEEDED", "JobName": null, "ParameterList": null, "NotificationCode": null, "CallbackURL": null, "JobOptions": null, "StatusCode": null, "ESSParameters": null, "links": [ { "rel": "self", "href": "https://servername.fa.us2.oraclecloud.com/fscmRestApi/resources/11.13.18.05/erpintegrations/getESSJobStatus", "name": "erpintegrations", "kind": "item" }, { "rel": "canonical", "href": "https://servername.fa.us2.oraclecloud.com/fscmRestApi/resources/11.13.18.05/erpintegrations/getESSJobStatus", "name": "erpintegrations", "kind": "item" } ] } ], "count": 1, "hasMore": false, "limit": 25, "offset": 0, "links": [ { "rel": "self", "href": "https://servername.fa.us2.oraclecloud.com/fscmRestApi/resources/11.13.18.05/erpintegrations", "name": "erpintegrations", "kind": "collection" } ] }


Implemented steps:

  1. Create a REST API connection using the ERP REST URL and provide security as Basic Authentication: username and password.
  2. First, list the files from an FTP directory using an FTP connection.
  3. Add a for-each action to iterate over the listed files.
  4. Read the file using the FTP adapter connection. Provide a sample CSV file. Map the currentFile >> directory and filename to the FTP read request.
  5. Add a stage-file write to create the supplier FBDI template format file. Filename: PozSupplierInt.csv, with sample CSV columns C1 to C156. Map the required columns.
  6. Zip the file using a stage-file action: PozSupplierInt.zip (a standalone sketch of producing this zip as a base64 DocumentContent appears after these steps).
  7. Configure the REST call for importBulkData:
    1. REST URI: /fscmRestApi/resources/11.13.18.05/erpintegrations
    2. Verb: POST
    3. Configure the request and response JSON payloads.
    4. Map the required values.
  8. Assign a status variable with the value "NOT_SUCCEEDED".
  9. Add a while loop to check whether the import status is successful. Condition: $status != "SUCCEEDED"
  10. Configure a REST endpoint to check the import status:
    1. URI: /fscmRestApi/resources/11.13.18.05/erpintegrations
    2. Verb: GET
    3. Add a query parameter: finder
    4. Provide a JSON payload for the response.
  11. Map the importBulkData response >> ReqstId to finder:
    1. finder = concat("ESSJobStatusRF;requestId=", ReqstId)
  12. Update status = RequestStatus.
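
As referenced in step 6 above, the zip that becomes DocumentContent is produced by the stage-file action inside OIC. For testing the REST API outside OIC, an equivalent base64 payload could be produced in Node.js, for example with the JSZip library (an assumed dependency, not something the integration itself uses):

// Sketch: build a zip containing the supplier FBDI CSV and return it as a base64 string.
const JSZip = require("jszip");

async function buildDocumentContent(csvContent) {
  const zip = new JSZip();
  zip.file("PozSupplierInt.csv", csvContent);      // FBDI interface file name used in this post
  return zip.generateAsync({ type: "base64" });    // value for the DocumentContent attribute
}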

Detailed screenshots:

Create REST connection:

Integration flow:

List file:

For each file:

Read file:

Write FBDI file:

Zip file:

Import bulk data:

Get ESS job status check:

Testing:

Notes:
 
1. While invoking importBulkData for supplier import, keep in mind that the interface details number must be 24 (for example, JobOptions = "InterfaceDetails=24,ImportOption=Y,PurgeOption=Y"); otherwise it will not find the proper supplier control file in ERP.

2. Supplier FBDI template:

3. Supplier source file sample:
SupplierSource.csv

Action,SupplierName,SupplierNumber,AlternateName,OrgType,InactiveDate,SupplierType,Relationship,Permanent,Country,TaxID
CREATE,3M,Company123,3221,3M,Corporation,09-10-2014,Supplier,SPEND_AUTHORIZED,N,US,410417775

4. We can also perform other operations such as uploadFileToUCM, submitESSJobRequest, exportBulkData, purging the interface tables, etc.



Friday, June 7, 2024

OIC - A large number of scheduled orchestrated integrations need to run at the same time. What can we do to avoid backlogged or queued integrations?

When too many scheduled integrations are configured, instances can get backlogged waiting for resources to become available or for previous integration runs to complete. This can cause processing delays where some instances are in a waiting state longer than they should be, and schedules may not start at the scheduled time.

If we absolutely require a large number of scheduled integrations, the following design changes are recommended as a solution:

  1. For each scheduled integration, convert it to an application-driven orchestrated, REST Adapter trigger-based integration. 
  2. Create a new scheduled integration that only performs an asynchronous invoke of the application-driven orchestrated integration you converted in step 1 above.

Benefits of the design:

This solution enables the scheduled integration to start at the scheduled time, invoke the REST Adapter trigger-based child integration asynchronously, and complete within milliseconds. This approach reduces the backlog and contention surrounding scheduler resources.


If you have a large number of scheduled integrations to convert, a staged approach starting with the following integrations is recommended.
  1. Longest-running scheduled integrations.
  2. Scheduled integrations configured with the shortest frequency (for example, integrations that run every 10 minutes or less).

In the above screenshot, I have created one scheduled integration that calls an app-driven integration in which I intentionally placed a 5-minute wait. When I run the scheduled one, it completes in milliseconds, whereas the app-driven integration is still running.

Reference:

https://docs.oracle.com/en/cloud/paas/integration-cloud/integrations-user/common-integration-style-pitfalls-and-design-best-practices.html

Tuesday, June 4, 2024

OIC - ATP - Polling data

Use case: Here, we will see how to poll data from an ATP database.

Note:

Can we poll data as a trigger using the ATP DB?

  • For direct connections (configured without using the connectivity agent), the Oracle Autonomous Transaction Processing Adapter can only be used for making outbound invocations as an invoke connection. This type of Oracle Autonomous Transaction Processing Adapter-specific connection cannot be configured as a trigger connection in an integration. Use cases related to inbound polling must be implemented using the scheduled orchestration integration pattern.
  • Only if we create or configure the ATP connection using the connectivity agent can we poll data as a trigger from the ATP database.


Detailed screenshots:

Step 1: Create the ATP connection using the connectivity agent.

Step 2: Take an app-driven integration and configure the ATP adapter to poll for new or changed records, using a logical delete on the STATUS field.

Put a logger:

Insert some data into the DB table:

From Monitoring, track the instances:

After polling:

Note: If we create a direct ATP connection without the connectivity agent, then during trigger configuration it will not allow the polling setup, because this feature is only supported with the connectivity agent.




