Wednesday, March 23, 2022

OIC - Managing Multiple Operations in OIC SOAP Adapter

Introduction:

  • The OIC SOAP Adapter is an extensively used adapter to expose a SOAP-based external interface on the cloud for many other applications to access. 
  • Today, any integration within OIC exposes only a single operation in its interface. 
  • If your implementation needs to support multiple operations, each operation requires a separate integration with its own exposed interface, so complexity grows linearly, O(n), where n is the number of operations to be exposed.

Problem Statement:
Expose multiple operations using a single SOAP integration. Is this possible?

Analysis and Solution:
Yes, it is certainly possible with a workaround and some limitations.

Workaround steps:

  1. Create one integration per operation that needs to be implemented (referred to as child flows from here on).
  2. Create a SOAP connection that uses the WSDL exposed by the child integration and runtime credentials with the corresponding security policy (referred to as child connections from here on).
  3. Repeat steps 1 & 2 for all required operations.
  4. Create a generic WSDL interface and a SOAP trigger connection that uses it. This WSDL interface uses an anyType payload with a generic operation name.
  5. Create an integration using the above connection as the trigger (referred to as the parent flow from here on) that will call the child integrations.
  6. Once the trigger is configured, parse the payload to identify the operation and derive a unique condition to determine which child to invoke.
  7. Use a switch action with conditions to uniquely identify the child invocations.
  8. Use an invoke activity with the child connections in each condition.
Note: This is not a recommended implementation pattern, just a workaround until out-of-the-box support is provided.
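
Conceptually, the parent flow is just a dispatcher over the child flows. A minimal Python sketch of that routing idea (the operation-to-child mapping and flow names are hypothetical illustrations, not OIC APIs):

```python
# Sketch of the parent flow's dispatch logic: parse the generic payload,
# find the root element inside <xml>, and route to the matching child flow.
import xml.etree.ElementTree as ET

CHILD_FLOWS = {  # operation name -> child integration (assumed names)
    "login": "child_login_flow",
    "logout": "child_logout_flow",
}

def dispatch(envelope_xml: str) -> str:
    root = ET.fromstring(envelope_xml)
    # the generic interface wraps the real payload in <echoAny><xml>...</xml>
    inner = root.find("xml")[0]          # first child of <xml>
    op = inner.tag.split("}")[-1]        # strip any namespace qualifier
    return CHILD_FLOWS[op]
```

In OIC itself, the same decision is made with a switch action over the extracted root-element name.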

Generic interface for the parent flow. The WSDL should support any type of payload:

<definitions xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd" xmlns:wsp="http://www.w3.org/ns/ws-policy" xmlns:wsp1_2="http://schemas.xmlsoap.org/ws/2004/09/policy" xmlns:wsam="http://www.w3.org/2007/05/addressing/metadata" xmlns:soap="http://schemas.xmlsoap.org/wsdl/soap/" xmlns:tns="http://mock.service.com/" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns="http://schemas.xmlsoap.org/wsdl/" targetNamespace="http://mock.service.com/" name="EchoMultiOpsImplService">
  <types>
    <xs:schema xmlns:tns="http://mock.service.com/" xmlns:xs="http://www.w3.org/2001/XMLSchema" version="1.0" targetNamespace="http://mock.service.com/">
      <xs:element name="echoAny" type="tns:echoAnyType"/>
      <xs:element name="echoAnyResponse" type="tns:echoAnyType"/>
      <xs:complexType name="echoAnyType">
        <xs:sequence>
          <xs:element name="xml" type="xs:anyType"/>
        </xs:sequence>
      </xs:complexType>
    </xs:schema>
  </types>
  <message name="echoSingle">
    <part name="parameters" element="tns:echoAny"/>
  </message>
  <message name="echoSingleResponse">
    <part name="parameters" element="tns:echoAnyResponse"/>
  </message>
  <portType name="EchoAny">
    <operation name="echoAny">
        <input wsam:Action="http://mock.service.com/echoSingle" message="tns:echoSingle"/>
        <output wsam:Action="http://mock.service.com/echoSingleResponse" message="tns:echoSingleResponse"/>
    </operation>
  </portType>
</definitions>

Converting the entire child node of the xml element to a string:

xmlDataAsString = ora:getContentAsString(/nsmpr8:echoAny/xml/*)

Identify the root element from the child element of the xml tag:

xmlDataRoot = substring-before(substring-after($xmlDataAsString, '<'),'>') 

There can be cases where the root element takes one of the forms below:

<rootElement>
<rootElement xmlns="someTargetNamespace">
<nsprefix:rootElement xmlns:nsprefix="someTargetNamespace">

The value of $xmlDataRoot at this point should contain only 'rootElement' so we can route appropriately. So we add a switch action to handle this and update the variable using assign actions (pseudo-code shown below):

if $xmlDataRoot contains ' ' (space) --> substring-before($xmlDataRoot, ' ')

if $xmlDataRoot contains ':' --> substring-after($xmlDataRoot, ':')

The final value of $xmlDataRoot at this point will be 'rootElement'.

Now, based on the root element, we will create a switch and call each operation.
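
For illustration, the same string handling can be sketched in Python (a stand-in for the XPath/assign logic above; it does not run inside OIC):

```python
# Mirror of the XPath string functions used to isolate the root element name.
def extract_root_element(xml_as_string: str) -> str:
    # substring-before(substring-after($s, '<'), '>')
    root = xml_as_string.split("<", 1)[1].split(">", 1)[0]
    # if it contains a space, keep only the tag name (drops xmlns/attributes)
    if " " in root:
        root = root.split(" ", 1)[0]
    # if it contains ':', drop the namespace prefix
    if ":" in root:
        root = root.split(":", 1)[1]
    return root
```

All three root-element forms listed above reduce to the bare name 'rootElement'.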


XML payload used for the login operation:

<login>
  <AccountNumber>111</AccountNumber>
  <UserName>abc</UserName>
  <Password>pwd</Password>
</login>

For the Base64 opaque schema, follow my blog:

https://soalicious.blogspot.com/2022/02/xsd-opaque-schema.html

Poc Screenshots:

Create a generic soap connection.


Create a parent integration and use generic wsdl connection as trigger.





Convert the xml child data into string


Fetch the xml root element (which is the operation name).




Take a switch and add cases for each operation, for example $xmlDataRoot = "login".


In each switch case, we need to convert the XML data string back into structured XML, using a stage-file write with a Base64 (opaque) schema followed by a stage-file read with the XML payload schema.
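
The write/read round trip can be pictured with this Python stand-in for OIC's stage-file write (Base64/opaque) plus stage-file read (XML schema) pattern; it is illustrative only:

```python
# String -> opaque (base64) -> structured XML round trip.
import base64
import xml.etree.ElementTree as ET

def string_to_xml(xml_data_as_string: str) -> ET.Element:
    # "write file": encode the raw string as base64 (opaque content)
    encoded = base64.b64encode(xml_data_as_string.encode("utf-8"))
    # "read file": decode and parse it back as structured XML
    return ET.fromstring(base64.b64decode(encoded).decode("utf-8"))
```

After the read, the payload is structured data that can be mapped to the child integration's elements.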










Now call the child integration and map the read file content to child elements.



Convert the response to string.
 

Map the string response back to the caller.


Flow diagram with one operation in the switch. You can add more switch cases for other operations.

Tuesday, March 15, 2022

ERP BI SQL query to fetch output based on ESS Request History table last run date | Fetch processstart from the ESS_Request_History table

Usecase: Using the ESS job tables ESS_request_history and ESS_request_property, we will fetch the latest succeeded processstart value as last_run_date and then compare this last_run_date with last_update_date from xla_transaction_headers.

Sample BI SQL queries to fetch the latest last-run-date data from the ERP DB tables:

Example1:

WITH ess_requests
AS (select MAX(erh.processstart) last_run_date FROM fusion.ess_request_history erh, fusion.ess_request_property erp1
Where erh.executable_status = 'SUCCEEDED'
AND erp1.requestid = erh.requestid
AND erp1.name = 'reportID'
AND erp1.VALUE = '/Custom/Integrations/Poc/SampleREP.xdo'
AND erh.submitter = 'SVC_OIC_ERP')

Select '1' super_group
,xth.transaction_number
,xal.sr8 policy_number
,gcc.segment1
,gcc.segment2
...........
From xla_ae_lines xal
,xla_ae_headers xah
,xla_events xe
,xla_transaction_entities xte
,xla_transaction_headers xth
,gl_code_combinations gcc
,xla_subledgers
,xla_event_types_b xet
,ess_requests er

Where 1=1
And xte.entity_id = xe.entity_id
And xte.application_id = xe.application_id
And xe.event_id = xah.event_id
And xah.ledger_id = xal.ledger_id
...........
And xth.last_update_date > NVL(er.last_run_date, SYSDATE - 2)

Example2:

With jobrun as 
(Select (
Select nvl(MAX(erh.processstart), sysdate-1)
From fusion.ess_request_history erh
Where 1=1
And upper(erh.definition) like '%ESS_Report_Name%'
And erh.executable_status = 'SUCCEEDED'
AND erh.submitter = 'SVC_OIC_ERP') last_run
From dual)

Select distinct cc.segment1
,cc.segment2
,jeh.description
,jeh.status
........
From gl_period_statuses ps
,gl_je_headers jeh
,gl_je_lines jel
,gl_je_sources src
,gl_code_combinations cc
,gl_je_batches jeb
,xla_ae_lines xal
,xla_ae_headers xah
,jobrun
........
Where 1=1
And ps.application_id =101
And ps.ledger_id = jeh.ledger_id
And jeh.period_name = ps.period_name
.....
And ((jeh.last_update_date) >=jobrun.last_run
Or (jel.last_update_date) >=jobrun.last_run
Or (jeb.last_update_date) >=jobrun.last_run
Or (src.last_update_date) >=jobrun.last_run
Or (xah.last_update_date) >=jobrun.last_run
Or (xal.last_update_date) >=jobrun.last_run)



Oracle Fusion ess job definition tables | ERP ESS job Schedule processes tables

Following are the Oracle Fusion job schedule processes tables which help to extract the submitted ESS job requests:

  • ESS_Request_History
  • ESS_Request_Property
  • Request_History
  • Request_History_view


Below is the detail SQL query for the ESS job schedule processes:

Select r.requestid,
r.name "ESS Job Name",
r.executable_status "Job Status",
r.username,
r.application,
r.product,
r.requestedstart,
r.completedtime
from fusion.ess_request_history r, fusion.ess_request_property v
where 1=1 and r.requestid = v.requestid
order by r.requestid, v.name


BI SQL Query sample to read data based on the last run date from the ess request history table:

https://soalicious.blogspot.com/2022/03/erp-bi-sql-query-to-fetch-output-based.html


Monday, March 14, 2022

OIC ERP BI Bursting using FTP delivery channel

UseCase: We will create a bursting BI report, create an ESS job over the bursting BI report, call the ESS job from OIC, and save the report output via the FTP delivery channel. Here, we will create one document/single file.

Please follow my previous blog for BI bursting background and details:

https://soalicious.blogspot.com/2022/03/oic-erp-bi-bursting.html

Bursting Query:

SELECT 'XX' AS "KEY",
'XXSupplierRecord' AS "TEMPLATE",
'en-US' AS "LOCALE",
'TEXT' AS "OUTPUT_FORMAT",
'FTP' AS "DEL_CHANNEL",
'AmazonS3' AS "PARAMETER1",
:p_destination AS "PARAMETER4",
'XXSUPPLIERREPORT_' || TO_CHAR(SYSDATE, 'YYYY-MM-DD_HHMISS') || '.csv' AS "PARAMETER5",
'true' AS "PARAMETER6"
FROM DUAL

BI report Query:

Select 'XX' AS "KEY",
hp.party_name supplier_name,
poz.segment1 supplier_number,
poz.vendor_id supplier_id,
hp.email_address,
hp.country,
hp.city,
hp.status
from POZ_SUPPLIERS poz, HZ_PARTIES hp
where 1=1
And poz.party_id = hp.party_id
And rownum <= 10;

Create BI Bursting Report:

Navigator >> Tools >> Reports and Analytics >> Browse catalog >> traverse Shared Folders -- Custom -- Integrations -- Poc >> 














Create ESS job:

Navigator >> My Enterprise >> Setup and Maintenance >> search with Manage Enterprise in search tasks



Provide the following details:
Name: XXSupplierREP_ESS
Path: /delta/
Application: Application Toolkit
Description: POC bursting report
Job Application name: FscmEss
Job type: BIPJobType and bursting report
Report id: /Custom/Integrations/Poc/XXSupplierREP1.xdo
Allow multiple pending submissions: True

Create a parameter: p_destination.


Call From OIC:

JobPackageName: oracle/apps/ess/custom/delta

JobDefinitionName: XXSupplierREP_ESS

Parameter: Sftp path to save the file.














ERP delivery configuration:

Navigation: Tools >> Reports and Analytics >> Browse Catalog >> Administration 






Bursting for multiple files:



Sample bursting query for sending email:

Select a.vendor_id KEY,
'Test_supp' TEMPLATE,
'en-US' LOCALE,
'PDF' OUTPUT_FORMAT,
'Supplier Details' OUTPUT_NAME,
'EMAIL' DEL_CHANNEL,
'test@gmail.com' PARAMETER1,
'test@gmail.com' PARAMETER2,
'bipublisher-report@oracle.com' PARAMETER3,
'Supplier Details' PARAMETER4,
'Please find the attached file' PARAMETER5,
'true' PARAMETER6,
'donotreply@oracle.com' PARAMETER7
From poz_suppliers a


For more details follow oracle docs:

https://docs.oracle.com/cd/E21764_01/bi.1111/e18862/T527073T555155.htm#:~:text=About%20Bursting,-Bursting%20is%20a&text=For%20each%20block%20of%20the,element%20in%20the%20data%20model.



OIC ERP About BI Bursting

 About Bursting

  • Bursting is a process of splitting data into blocks, generating documents for each block, and delivering the documents to one or more destinations.
  • The data for the report is generated by executing a query once and then splitting the data based on a "Key" value. For each block of the data, a separate document is generated and delivered.
  • Using BI Publisher's bursting feature you can split a single report based on an element in the data model and deliver the report based on a second element in the data model.
  • Driven by the delivery element, you can apply a different template, output format, delivery method, and locale to each split segment of your report. Example implementations include:
      • Invoice generation and delivery based on customer-specific layouts and delivery preference
      • Financial reporting to generate a master report of all cost centers, splitting out individual cost center reports to the appropriate manager
      • Generation of pay slips to all employees based on one extract and delivered via e-mail

What is the Bursting Definition?

A bursting definition is a component of the data model. After you have defined the data sets for the data model, you can set up one or more bursting definitions. When you set up a bursting definition, you define the following:
  • The Split By element is an element from the data that will govern how the data is split. For example, to split a batch of invoices by each invoice, you may use an element called CUSTOMER_ID. The data set must be sorted or grouped by this element.
  • The Deliver By element is the element from the data that will govern how formatting and delivery options are applied. In the invoice example, it is likely that each invoice will have delivery criteria determined by customer, therefore the Deliver By element may also be CUSTOMER_ID.
  • The Delivery Query is a SQL query that you define for BI Publisher to construct the delivery XML data file. The query must return the formatting and delivery details.
Defining the Query for the Delivery XML

The bursting query is a SQL query that you define to provide BI Publisher with the required information to format and deliver the report. BI Publisher uses the results from the bursting query to create the delivery XML.

The BI Publisher bursting engine uses the delivery XML as a mapping table for each Deliver By element. The structure of the delivery XML required by BI Publisher is as follows:

<ROWSET>
  <ROW>
    <KEY></KEY>
    <TEMPLATE></TEMPLATE>
    <LOCALE></LOCALE>
    <OUTPUT_FORMAT></OUTPUT_FORMAT>
    <DEL_CHANNEL></DEL_CHANNEL>
    <TIMEZONE></TIMEZONE>
    <CALENDAR></CALENDAR>
    <OUTPUT_NAME></OUTPUT_NAME>
    <SAVE_OUTPUT></SAVE_OUTPUT>
    <PARAMETER1></PARAMETER1>
    <PARAMETER2></PARAMETER2>
    <PARAMETER3></PARAMETER3>
    <PARAMETER4></PARAMETER4>
    <PARAMETER5></PARAMETER5>
    <PARAMETER6></PARAMETER6>
    <PARAMETER7></PARAMETER7>
    <PARAMETER8></PARAMETER8>
    <PARAMETER9></PARAMETER9>
    <PARAMETER10></PARAMETER10>
  </ROW>
</ROWSET>

where

• KEY - is the delivery key and must match the Deliver By element. The bursting engine uses the key to link delivery criteria to a specific section of the burst data.

• TEMPLATE - is the name of the layout to apply. Note that the value is the layout name (for example, 'Customer Invoice'), not the template file name (for example, invoice.rtf).

• LOCALE - is the template locale, for example, 'en-US'.

• OUTPUT_FORMAT - is the output format. For a description of each type, see Setting the Output Types. Valid values are: HTML, PDFZ - zipped PDF, PDF, RTF, EXCEL, EXCEL2000, MHTML, PPT, PPTX - PowerPoint 2007, XSLFO, XML, CSV, ETEXT.

• SAVE_OUTPUT - indicates whether to save the output documents to the BI Publisher history tables so that the output can be viewed and downloaded from the Report Job History page.

Valid values are 'true' (default) and 'false'. If this property is not set, the output will be saved.

DEL_CHANNEL - is the delivery method. Valid values are:
  • EMAIL
  • FAX
  • FILE
  • FTP
  • PRINT
  • WEBDAV
Delivery Channel & PARAMETER Values:

Email
PARAMETER1: Email address
PARAMETER2: cc
PARAMETER3: From
PARAMETER4: Subject
PARAMETER5: Message body
PARAMETER6: Attachment value ('true' or 'false'). If your output format is PDF, you must set this parameter to "true" to attach the PDF to the e-mail.
PARAMETER7: Reply-To
PARAMETER8: Bcc
(PARAMETER 9-10 are not used)

File
PARAMETER1: Directory
PARAMETER2: File Name
(PARAMETER 3-10 are not used)

FTP and SFTP
PARAMETER1: Server name
PARAMETER2: Username
PARAMETER3: Password
PARAMETER4: Remote Directory
PARAMETER5: Remote File Name
PARAMETER6: Secure (set this value to 'true' to enable Secure FTP)
(PARAMETER 7-10 are not used)
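
To make the FTP/SFTP mapping above concrete, here is a small Python sketch that assembles one delivery-XML row from named values (the function and placeholder values are illustrative, not part of BI Publisher):

```python
# Maps the FTP channel's positional PARAMETERn slots to readable names.
FTP_PARAM_NAMES = {
    "PARAMETER1": "server",            # server name
    "PARAMETER2": "username",
    "PARAMETER3": "password",
    "PARAMETER4": "remote_directory",
    "PARAMETER5": "remote_file_name",
    "PARAMETER6": "secure",            # 'true' enables Secure FTP
}

def ftp_delivery_row(key, template, **values):
    row = {"KEY": key, "TEMPLATE": template, "LOCALE": "en-US",
           "OUTPUT_FORMAT": "TEXT", "DEL_CHANNEL": "FTP"}
    for param, name in FTP_PARAM_NAMES.items():
        row[param] = values.get(name, "")  # unused slots stay empty
    return row
```

In practice this mapping is what the bursting SQL query encodes, as in the FTP bursting query shown in the previous post.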

Follow the blog below for steps to create a bursting BI report and call it from ERP:

Wednesday, March 9, 2022

OIC ERP BI publisher call Retry logic Usecase 2

Use Case: 

We have an OIC to ERP bulk import integration, and once the import completes, we call another sub-integration to get the callback from ERP. 

To get the callback, we call a BI Publisher report in the sub-integration and place the callback file in the SFTP directory. 

In production, we are observing a 5-minute time limit exceeded error while invoking the client ERP BI service; the integration fails and we do not receive the callback file. As a result, we have to manually generate the callback file from ERP for further processing.

What we do:

  • One - Retry the ERP callback call at least 5 times before finally failing the integration.
  • Two - On every retry, check whether the callback file has arrived in the SFTP server directory. This file check also loops five times.

Integration flow:



Steps in detail:

Step1: Take an assign and initialize the variables below:

p_controlReportExist : false

p_ExceptionCode: ""

p_ExceptionReason: ""

p_ExceptionDetails: ""

p_callBackCount: 0.0


Step2: Take a while loop and run it up to five times:

$p_controlReportExist: "false" and $p_callBackCount <=4.0


Step3: Take a scope, call the callback sub-integration, and assign the customized exception code, reason, and details.


Step4: Take an assign to initialize the file check count:

p_FileCheckCount = 0


Step5: Take a while loop and run it up to 5 times to check whether the file exists in the SFTP directory:

$p_FileCheckCount <=4.0 and $p_controlReportExist="false"


Step6: Call FTP with the list operation and the required file pattern and directory details.



Step7: Take a switch:

If ItemCount = 0,

Assign,

p_controlReportExist = "false"

p_FileCheckCount = $p_FileCheckCount + 1.0

Wait 59 sec

Else,

p_controlReportExist = "true"


Step8: Take an assign and increase the callback counter by 1.

p_callbackCount =$p_callbackCount + 1.0
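
The whole retry structure (Steps 2-8) amounts to two nested loops. A Python sketch, using hypothetical callables in place of the OIC callback invoke and the FTP list action:

```python
# Nested retry loops: outer loop retries the callback call, inner loop
# polls the SFTP directory for the callback file.
import time

def run_with_retries(invoke_callback, file_exists, max_retries=5,
                     max_file_checks=5, wait_seconds=59):
    for _ in range(max_retries):             # outer while: callback retries
        invoke_callback()                    # scope: call callback sub-integration
        for _ in range(max_file_checks):     # inner while: SFTP file checks
            if file_exists():                # FTP list operation found the file
                return True
            time.sleep(wait_seconds)         # wait before the next check
    return False                             # all retries exhausted
```

With the defaults this mirrors the flow above: up to 5 callback attempts, each followed by up to 5 file checks spaced 59 seconds apart.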


Conclusion:

This retry logic saves manual work and time; in most cases (99%), it works smoothly and gets the callback file.



