Use case: The requirement is that we have a source file containing transaction lines (records) where consecutive lines can repeat the same transaction number. Based on the sequential unique transaction numbers, we have to fetch one unique line per transaction and write it to a header file.
For instance,
Source file:
Transaction_Identifier,Transaction_number,Event_Type_Code,Ledger_Name,Date
Line,10,Payment,XX,08/09/2022
Line,10,Payment,XX,08/09/2022
Line,11,Reverse,XX,08/09/2022
Line,12,Payment,XX,08/09/2022
Line,12,Payment,XX,08/09/2022
Line,13,Payment,XX,08/09/2022
Line,13,Payment,XX,08/09/2022
Line,14,Payment,XX,08/09/2022
Expected output header file:
Transaction_number,Event_Type_Code,Ledger_Name,Date
10,Payment,XX,08/09/2022
11,Reverse,XX,08/09/2022
12,Payment,XX,08/09/2022
13,Payment,XX,08/09/2022
14,Payment,XX,08/09/2022
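As a quick illustration of the transformation above, here is a minimal Python sketch that keeps the first record of each run of identical transaction numbers. The data is hardcoded here only for demonstration; in the actual integration the records come from the staged CSV file, as described in the logic steps below.

def dedupe_by_transaction_number(lines):
    # Keep the first record of each run of identical transaction numbers.
    unique = []
    pre_transaction_number = ""              # no transaction seen yet
    for line in lines:
        transaction_number = line[1]         # second column: Transaction_number
        if pre_transaction_number != transaction_number:
            unique.append(line[1:])          # drop the Transaction_Identifier column
        pre_transaction_number = transaction_number
    return unique

sample = [
    ["Line", "10", "Payment", "XX", "08/09/2022"],
    ["Line", "10", "Payment", "XX", "08/09/2022"],
    ["Line", "11", "Reverse", "XX", "08/09/2022"],
    ["Line", "12", "Payment", "XX", "08/09/2022"],
]
print(dedupe_by_transaction_number(sample))
# [['10', 'Payment', 'XX', '08/09/2022'], ['11', 'Reverse', 'XX', '08/09/2022'], ['12', 'Payment', 'XX', '08/09/2022']]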
Logic steps:
- Using an FTP adapter connection, download the file to the stage location and then read the file reference using a sample CSV file format.
- Assign an empty string ("") to preTransactionNumber.
- Write the header field names to the header file.
- For each ledger line (record):
- Assign the current transaction number to the variable transactionNumber.
- Add a switch with the condition preTransactionNumber != transactionNumber.
- If it is true, write the ledger record to the header file at the stage location.
- Otherwise, skip the duplicate line.
- Assign preTransactionNumber = transactionNumber.
- After the loop, read the entire header file from the stage location.
- Write the file to an FTP location (a code sketch of these steps follows).
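The same logic end to end as a hedged Python sketch. It assumes the source file has already been downloaded to a local stage directory; the stage paths, FTP host, and credentials are hypothetical placeholders for what the adapter connections and stage-file actions configure in the actual integration.

import csv
import ftplib

STAGE_SOURCE = "/stage/source_file.csv"      # hypothetical stage location of the downloaded file
STAGE_HEADER = "/stage/header_file.csv"      # hypothetical stage location of the generated header file

with open(STAGE_SOURCE, newline="") as src, open(STAGE_HEADER, "w", newline="") as out:
    reader = csv.reader(src)
    writer = csv.writer(out)
    next(reader)                                            # skip the source header row
    writer.writerow(["Transaction_number", "Event_Type_Code", "Ledger_Name", "Date"])
    pre_transaction_number = ""                             # nothing processed yet
    for record in reader:
        transaction_number = record[1]                      # current Transaction_number
        if pre_transaction_number != transaction_number:    # new transaction: write it
            writer.writerow(record[1:])                     # drop the Transaction_Identifier column
        # otherwise it is a sequential duplicate and is skipped
        pre_transaction_number = transaction_number

# Upload the generated header file to the target FTP location (hypothetical host and credentials).
with ftplib.FTP("ftp.example.com", "user", "password") as ftp, open(STAGE_HEADER, "rb") as f:
    ftp.storbinary("STOR header_file.csv", f)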
Steps with screenshots:
Download the file