Monday, April 25, 2022

OIC - Call DB package insert operation | one to one database insert using Package

Use Case: We will create a simple database package and, using an OIC integration, insert customer records one by one into the database.

High-level steps:

  1. Create a customer table and a package to insert customer records.
  2. REST trigger to accept customer records.
  3. Call the package using the DB adapter.
  4. For each customer record, insert it into the database.

Create table:

CREATE TABLE customer1 (
Id VARCHAR2(20),
Name VARCHAR2(200),
Age NUMBER,
Address VARCHAR2(200)
);


Package Specification:

CREATE OR REPLACE PACKAGE PCK_CUSTOMER AS
PROCEDURE addCustomer(c_id customer1.id%TYPE,
c_name customer1.name%TYPE,
c_age customer1.age%TYPE,
c_addr customer1.address%TYPE);
END PCK_CUSTOMER;
/

Package Body:

CREATE OR REPLACE PACKAGE BODY PCK_CUSTOMER AS
PROCEDURE addCustomer(c_id customer1.id%TYPE,
c_name customer1.name%TYPE,
c_age customer1.age%TYPE,
c_addr customer1.address%TYPE)
IS
BEGIN
INSERT INTO CUSTOMER1(ID, NAME, AGE, ADDRESS) VALUES (c_id, c_name, c_age, c_addr);
END addCustomer;
END PCK_CUSTOMER;
/

REST trigger request JSON:

{
  "Customers": {
    "Customer": [
      {
        "Id": 1,
        "Name": "aaa",
        "Age": 12,
        "Address": "qqq"
      },
      {
        "Id": 2,
        "Name": "aaa",
        "Age": 12,
        "Address": "qqq"
      }
    ]
  }
}
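The per-record insert can be sketched outside OIC as well; below is a minimal Python stand-in (an in-memory sqlite3 table in place of the Oracle customer1 table, and a plain INSERT in place of the PCK_CUSTOMER.addCustomer call — purely illustrative, not part of the integration):

```python
import json
import sqlite3

# Sample REST trigger payload, same shape as the request JSON above.
payload = json.loads(
    '{"Customers": {"Customer": ['
    '{"Id": 1, "Name": "aaa", "Age": 12, "Address": "qqq"},'
    '{"Id": 2, "Name": "aaa", "Age": 12, "Address": "qqq"}]}}'
)

# In-memory stand-in for the Oracle customer1 table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer1 (id TEXT, name TEXT, age INTEGER, address TEXT)")

# One insert per record, mirroring the for-each loop calling addCustomer.
for c in payload["Customers"]["Customer"]:
    conn.execute(
        "INSERT INTO customer1 (id, name, age, address) VALUES (?, ?, ?, ?)",
        (str(c["Id"]), c["Name"], c["Age"], c["Address"]),
    )

print(conn.execute("SELECT COUNT(*) FROM customer1").fetchone()[0])  # prints 2
```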



Steps in detail:

Integration flow:


REST trigger configuration:

For each customer:

Invoke the stored procedure:

Map the customer record to the DB target fields.

Thursday, April 21, 2022

DB Offset - Fetch clause

The OFFSET and FETCH clauses are used in conjunction with SELECT and ORDER BY to retrieve a range of rows from a result set.

OFFSET: specifies the number of rows to skip before returning rows from the result set.

FETCH: returns the specified number of rows after the offset.

Syntax:

SELECT column_name(s)
FROM table_name
ORDER BY column_name
OFFSET rows_to_skip ROWS
FETCH NEXT number_of_rows ROWS ONLY;


Example:

Suppose you have 9 rows in a DB table and you want to fetch the 3rd through 7th rows (skip 2, take 5):

SELECT orderNumber, Customer
FROM orders
ORDER BY orderNumber
OFFSET 2 ROWS
FETCH NEXT 5 ROWS ONLY;
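The slicing behaviour of OFFSET/FETCH can be illustrated with a short sketch — here Python list slicing stands in for an already-ordered result set (fetch_page is a hypothetical helper, not a database API):

```python
def fetch_page(ordered_rows, offset, fetch_next):
    # Equivalent of: OFFSET <offset> ROWS FETCH NEXT <fetch_next> ROWS ONLY
    return ordered_rows[offset:offset + fetch_next]

orders = list(range(1, 10))          # 9 ordered rows: 1..9
page = fetch_page(orders, 2, 5)      # skip 2, take 5
print(page)                          # [3, 4, 5, 6, 7] -> the 3rd to 7th rows
```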


OIC - DB Procedure call | DB adapter 10 MB limit overcoming solution

Use Case: If we run a DB select query that returns a huge amount of data (more than 10 MB), the call fails because it exceeds the DB adapter limit. We can avoid this error by paginating the query with OFFSET/FETCH. In this use case, we will wrap the paginated SQL in a stored procedure.

To know about Offset & Fetch:

https://soalicious.blogspot.com/2022/04/db-offset-fetch-clause.html

High-level steps:

  1. Create a DB table orders.
  2. Create an OFFSET/FETCH select query.
  3. Wrap the SQL query in a stored procedure.
  4. In the integration,
    1. First fetch the total row count using a DB adapter call.
    2. Determine the number of iterations: ceiling(totalRows / batchSize).
    3. For each iteration,
      1. Select a batch of rows by offset.
      2. Log each row's details.
      3. Increment the counter and the offset.
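The iteration math above can be sketched in a few lines of Python (batch_offsets is a hypothetical helper used only to illustrate the counter/offset bookkeeping, not part of OIC):

```python
import math

def batch_offsets(total_rows, batch_size):
    """Yield an (offset, fetch_size) pair per iteration of the while loop."""
    total_iterations = math.ceil(total_rows / batch_size)  # totalIteration
    for i in range(total_iterations):
        yield (i * batch_size, batch_size)                 # offSet, fetchSize

print(list(batch_offsets(9, 4)))  # [(0, 4), (4, 4), (8, 4)] -> 3 iterations
```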
Steps in detail:

Create a table orders:

Create table orders(
"orderNumber" VARCHAR2(20 BYTE),
"Customer" VARCHAR2(200 BYTE)
)

Offset select DB query:

SELECT "orderNumber","Customer" FROM orders ORDER BY "orderNumber"
OFFSET 2 ROWS
FETCH NEXT 3 ROWS ONLY;

SQL query wrapped in a stored procedure:

CREATE OR REPLACE PROCEDURE OrdersByOffset(in_fetchSize NUMBER, in_offset NUMBER, p_Orders out SYS_REFCURSOR)
IS
BEGIN
OPEN p_Orders for
SELECT "orderNumber","Customer" 
FROM orders
ORDER BY "orderNumber"
OFFSET in_offset ROWS
FETCH NEXT in_fetchSize ROWS ONLY;
END;
/
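Outside OIC, the same pagination pattern can be tried against a local database; below is a minimal Python/sqlite3 stand-in (sqlite has no ref cursors and uses LIMIT/OFFSET rather than Oracle's OFFSET/FETCH syntax — the sample data is invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute('CREATE TABLE orders ("orderNumber" TEXT, "Customer" TEXT)')
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(f"ORD-{n:02d}", f"Cust{n}") for n in range(1, 10)],  # 9 sample rows
)

def orders_by_offset(fetch_size, offset):
    # sqlite equivalent of OFFSET :in_offset ROWS FETCH NEXT :in_fetchSize ROWS ONLY
    return conn.execute(
        'SELECT "orderNumber", "Customer" FROM orders '
        'ORDER BY "orderNumber" LIMIT ? OFFSET ?',
        (fetch_size, offset),
    ).fetchall()

print(orders_by_offset(3, 2))  # rows 3..5 of the ordered result
```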

Create a scheduled orchestration and count the total rows via a DB adapter call using the following SQL:

Select count(1) from orders


Total rows fetch DB adapter config:




Create variables for :

totalRows: number of rows fetched from db adapter.

batchSize: number(4.0)

totalIteration: ceiling(orajs29:divide($totalRows,$batchSize))




The next step is to invoke the stored procedure by offset to get each batch of rows.


Assign Counter and Offset:

Counter: number(1)

fetchSize: $batchSize

offSet: number(0)



In the while loop, use the condition:

$Counter <= $totalIteration



Invoke stored procedure.





Map the fetchSize and Offset.




Now, for each fetched row, we will log it and then increment the counter and offset.


For each row,



Log the row details:

concat("Order Details - OrderNr / Customer :",$currentRow/nsmpr3:Row/nsmpr3:Column[1]," / ",$currentRow/nsmpr3:Row/nsmpr3:Column[2])


Counter: $Counter + 1

offSet : $offSet + $fetchSize



Testing:

In my case, I have totalRows = 9 in the orders table and batchSize = 4, so totalIteration = ceiling(9 / 4) = 3.








Wednesday, April 20, 2022

OIC | Merge two CSV files into one CSV file based on a common field or primary key

Use Case: Here, we will merge two CSV files into one final file based on a common primary key or common field values.

In the below example, we will use EmpId as primary key in both the CSV files for merging.

EmpDetail.csv

EmpId,FName,LName

101,sid,das

102,dip,chak

103,sri,das

EmpContact.csv

EmpId,Contact

101,101_123456

102,102_234567

101,102_987654

103,103_456789

Merged file:

EmpId,FName,LName,Contact

101,sid,das,101_123456 102_987654

102,dip,chak,102_234567

103,sri,das,103_456789

High-level steps:

  1. Read both CSV files using the FTP adapter, one by one.
  2. For each emp record of the 1st CSV file, map the fields from both files and write the merged file using the FTP adapter. A condition on EmpId matches the records across the two files.
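The matching logic can be sketched outside OIC; below is a minimal Python illustration of the same EmpId-keyed merge, using the sample files above, with multiple contacts for an employee joined by a space:

```python
import csv
import io
from collections import defaultdict

emp_detail = "EmpId,FName,LName\n101,sid,das\n102,dip,chak\n103,sri,das\n"
emp_contact = ("EmpId,Contact\n101,101_123456\n102,102_234567\n"
               "101,102_987654\n103,103_456789\n")

# Index contacts by EmpId; an employee may have several contact rows.
contacts = defaultdict(list)
for row in csv.DictReader(io.StringIO(emp_contact)):
    contacts[row["EmpId"]].append(row["Contact"])

out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["EmpId", "FName", "LName", "Contact"])
for row in csv.DictReader(io.StringIO(emp_detail)):
    # Matching condition: EmpContactRec[EmpId = current EmpId]
    writer.writerow([row["EmpId"], row["FName"], row["LName"],
                     " ".join(contacts[row["EmpId"]])])

print(out.getvalue())
```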

Steps in detail:

Integration flow:


Once both files are read, we will loop over each emp record of the 1st file.





Map the current emp details and map the contact using following condition:

"$readEmployeeContact/nsmpr2:SyncReadFileResponse/ns24:FileReadResponse/ns22:EmpContactRecSet/ns22:EmpContactRec[ns22:Empid=$currentEmp/nsmpr1:EmpDataRec/nsmpr1:EmpId]/ns22:Contact"



Test:










Note: instead of using a for-each action, we can also do the same in the XSLT itself:

Xslt code:

<nstrgmpr:WriteFile xml:id="id_12">
<ns30:Employees>
<xsl:for-each select="$readEmployeeData/nsmpr2:SyncReadFileResponse/ns27:FileReadResponse/ns26:EmpDataRecSet/ns26:EmpDataRec">
<xsl:variable name="EmpId" select="ns26:EmpId"/>
<ns30:Employee>
<ns30:C1>
<xsl:value-of select="ns26:EmpId"/>
</ns30:C1>
<ns30:C2>
<xsl:value-of select="ns26:FName"/>
</ns30:C2>
<ns30:C3>
<xsl:value-of select="ns26:LName"/>
</ns30:C3>
<ns30:C4>
<xsl:value-of select="$readEmployeeContact/nsmpr1:SyncReadFileResponse/ns23:FileReadResponse/ns20:EmpContactRecSet/ns20:EmpContactRec[ns20:EmpId=$EmpId]/ns20:Contact"/>
</ns30:C4>
</ns30:Employee>
</xsl:for-each>
</ns30:Employees>
</nstrgmpr:WriteFile>



Tuesday, April 19, 2022

DB - Check the status for all type of objects using user_objects

Query:

select object_type, object_name, status from user_objects where object_type in ('FUNCTION','PROCEDURE','PACKAGE','PACKAGE BODY','TRIGGER') order by object_type,object_name;




Oracle Integration Cloud | Merge two CSV files into a single file without any primary key or common field

Use Case: Here, we will merge the content of two files into one CSV file where there is no common field.

Consideration: we will map the two files sequentially. That means the 1st row of file1 maps to the 1st row of file2, the 2nd row of file1 to the 2nd row of file2, ..., the nth row of file1 to the nth row of file2.

For example,

File1.csv:

FirstName,LastName

sid,das

dip,chak

File2.csv

EMail

sid@gmail.com

dip@gmail.com

Merged file:

FirstName,LastName,EMail

sid,das,sid@gmail.com

dip,chak,dip@gmail.com

High-level steps:

  1. Create a scheduled orchestration with a name.
  2. Drag and drop the FTP adapter and read both source files one by one.
  3. Drag and drop the FTP adapter to write the final merged file. In the map, for each file1 record, we create a variable named "i" with value "position()" and use that position to fetch the corresponding value from file2 via $i.
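The position-based pairing in step 3 can be sketched as a few lines of Python (purely illustrative; zip plays the role of the File2Rec[$i] positional lookup):

```python
import csv
import io

file1 = "FirstName,LastName\nsid,das\ndip,chak\n"
file2 = "EMail\nsid@gmail.com\ndip@gmail.com\n"

rows1 = list(csv.reader(io.StringIO(file1)))
rows2 = list(csv.reader(io.StringIO(file2)))

# Pair row i of file1 with row i of file2, skipping the headers.
merged = [r1 + r2 for r1, r2 in zip(rows1[1:], rows2[1:])]
print(merged)  # [['sid', 'das', 'sid@gmail.com'], ['dip', 'chak', 'dip@gmail.com']]
```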
Steps in detail(with screenshots):

Integration flow


Read file1






Read file2






Write Merge file






Map codes:

<xsl:template match="/" xml:id="id_11">
<nstrgmpr:WriteFile xml:id="id_12">
<ns28:FileMergedRecSet>
<ns28:FileMergedRec>
<ns28:C1>
<xsl:value-of select="'Fname'"/>
</ns28:C1>
<ns28:C2>
<xsl:value-of select="'Lname'"/>
</ns28:C2>
<ns28:C3>
<xsl:value-of select="'Email'"/>
</ns28:C3>
</ns28:FileMergedRec>
<xsl:for-each select="$readFile1/nsmpr1:SyncReadFileResponse/ns21:FileReadResponse/ns22:File1RecSet/ns22:File1Rec">
<xsl:variable name="i" select="position()"/>
<ns28:FileMergedRec>
<ns28:C1>
<xsl:value-of select="ns22:FirstName"/>
</ns28:C1>
<ns28:C2>
<xsl:value-of select="ns22:LastName"/>
</ns28:C2>
<ns28:C3>
<xsl:value-of select="$readFile2/nsmpr2:SyncReadFileResponse/ns25:FileReadResponse/ns24:File2RecSet/ns24:File2Rec[$i]/ns24:EMail"/>
</ns28:C3>
</ns28:FileMergedRec>
</xsl:for-each>
</ns28:FileMergedRecSet>
</nstrgmpr:WriteFile>
</xsl:template>



