Wednesday, November 19, 2025

OIC - Using Dynamic Paths for OCI Object Storage in OIC Integrations | Handling “Subfolders” in OCI Object Storage bucket

Use Case

In Oracle Integration Cloud (OIC), when working with Oracle Object Storage, “folders” inside a bucket are not real directories. They are simply part of the object name (key).
Because of this, OIC cannot dynamically navigate subfolders the same way as SFTP or FTP directories.

So when an integration requires:

  • Upload Path
  • Archive Path
  • Error/Reject Path
  • Dynamic Environment-specific Paths (DEV/TEST/PROD)

…you must pass these paths as string fields and build the full object name by concatenating path + filename.

This approach is required to handle dynamic folder structures, avoid hardcoding paths, and support multiple environments using a single integration.


Solution Overview

To manage folder-like object prefixes correctly in OIC:

  1. Create lookup/variables for folder paths (upload, archive, error).
  2. Pass these as integration input or derive from a lookup.
  3. Concatenate the path with the file name to create the full object key.
  4. Use the Object Storage adapter with the generated key (object name).
  5. Avoid assuming physical directories—treat each folder as a prefix.

Solution Steps

Step 1: Create Path Fields

Create the following fields in your integration or lookup:

  • BucketName – e.g., bkt-dev-payment
  • uploadPath – e.g., /incoming/employee/ or / for no subfolder
  • archivePath – e.g., /archive/employee/ or / for no subfolder
  • errorPath – e.g., /error/employee/ or / for no subfolder

Ensure each ends with a trailing slash.
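If the paths are maintained in a lookup (DVM) rather than passed as integration inputs, a single row per integration keeps the design environment-portable. A hypothetical row layout, loosely modelled on the lookup used later in this post (all values are illustrative):

IntegrationId   BucketSubfolder        ArchiveBucketSubfolder   SourceFileName
EMP_INT_01      /incoming/employee/    /archive/employee/       EMPLOYEE_YYYYMMDD.csv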


Step 2: Accept or Fetch File Name

From the source system or from the file adapter, capture the file name:

Example:
EMPLOYEE_20251119.csv


Step 3: Construct the Object Key

Use an assign activity in OIC:

fullObjectName = uploadPath || fileName

Examples:

incoming/employee/EMPLOYEE_20251119.csv
archive/employee/EMPLOYEE_20251119.csv

This string is the actual object key in Object Storage.
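In the OIC Assign action this is just an XPath concat; the variable names below are illustrative:

concat($Var_UploadPath, $Var_FileName)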


Step 4: Configure Object Storage Adapter

In the adapter configuration:

  • Select "Specify Object Name"
  • Map it to the concatenated value (fullObjectName)

No directory browsing is required because subfolders are just text prefixes.


Step 5: Write or Read the Object

Use the adapter normally — OIC will treat the full object key as the complete path.

  • Write → Upload to path-like object key
  • Read → Fetch file using the exact key
  • Move/Archive → Upload the same file under the archive key and delete the original
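If you ever call the Object Storage REST API directly (for example through a plain REST connection instead of the adapter), the same key simply becomes part of the request URI. Namespace, bucket, and object values below are placeholders:

PUT    /n/{namespaceName}/b/{bucketName}/o/incoming/employee/EMPLOYEE_20251119.csv    (write/upload)
GET    /n/{namespaceName}/b/{bucketName}/o/incoming/employee/EMPLOYEE_20251119.csv    (read/download)
DELETE /n/{namespaceName}/b/{bucketName}/o/incoming/employee/EMPLOYEE_20251119.csv    (delete after archiving)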

Step 6: Repeat for Archive and Error Handling

During archiving:

archiveObjectName = archivePath || fileName

During error processing:

errorObjectName = errorPath || fileName

Solution with screenshots:

Upload / Archive / Get file:
URI: /n/{namespaceName}/b/{bucketName}/o/{subfolder}{objectName}
Map the subfolder path from the lookup:
dvm:lookupValue("<LookupName>_Common_CCS_Interface_SFTP_Lookup",
                "IntegrationId",
                $Var_InterfaceId,
                "ArchiveBucketSubfolder",
                "/")




List files:
Create a variable (for example, Var_BucketSubfolderPath) and store the bucket subfolder path, using substring-after to strip the leading slash returned by the lookup:
fn:substring-after(
    dvm:lookupValue(
        "<LookupName>_Common_CCS_Interface_SFTP_Lookup",
        "IntegrationId",
        $Var_InterfaceID,
        "BucketSubfolder",
        "/"
    ),
    "/"
)
Add the bucket subfolder path as a prefix to the object name:
concat(
    $Var_BucketSubfolderPath,
    replace(
        dvm:lookupValue(
            "<LookupName>_Common_CCS_Interface_SFTP_Lookup",
            "IntegrationId",
            $Var_InterfaceID,
            "SourceFileName",
            "NA"
        ),
        "YYYYMMDD",
        concat(
            substring(FileProcessingDate, 3, 2),
            substring(FileProcessingDate, 5, 2),
            substring(FileProcessingDate, 7, 2)
        )
    )
)
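As a worked example, assuming the lookup returns EMPLOYEE_YYYYMMDD.csv as the SourceFileName, $Var_BucketSubfolderPath resolves to incoming/employee/, and FileProcessingDate arrives as 20251119, the expression replaces the YYYYMMDD token with a two-digit-year date (YYMMDD) and yields:

incoming/employee/EMPLOYEE_251119.csv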






Rename file:
This is the same as list files: add the subfolder prefix to the object name.
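For reference, the underlying RenameObject REST call takes both names in the request body, so the prefix has to appear in both; the names below are illustrative:

POST /n/{namespaceName}/b/{bucketName}/actions/renameObject

{
  "sourceName": "incoming/employee/EMPLOYEE_20251119.csv",
  "newName": "incoming/employee/Processed_EMPLOYEE_20251119.csv"
}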




Final Note

This approach is the recommended design pattern because Object Storage does not support directory navigation. Treating paths as string prefixes ensures full flexibility and environment portability in all OIC integrations.


Sunday, November 16, 2025

Microsoft Excel - Pivot Tables


1. Understanding Excel PivotTables

A PivotTable is a tool that summarizes, analyzes, and presents large amounts of data quickly.
It helps you group, filter, total, count, and compare data without writing formulas.


2. Creating an Excel PivotTable

  1. Select your data range (with headers).
  2. Go to Insert → PivotTable.
  3. Choose where to place the PivotTable (new or existing worksheet).
  4. Drag fields into:
    • Rows → categories
    • Columns → comparison groups
    • Values → totals/counts
    • Filters → top-level filter





3. Modifying Excel PivotTable Calculations

You can change the calculation in Values:

  • Click the field → Value Field Settings
  • Choose Sum, Count, Average, Max, Min, % of total, etc.

Useful for switching from SUM to COUNT or AVERAGE easily.


4. Grouping PivotTable Data

You can group:

  • Dates → months, quarters, years
  • Numbers → group into ranges (e.g., 0–100, 101–200)
  • Text → manually group selected items

Right-click → Group.


5. Formatting PivotTable Data

You can format:

  • Numbers (currency, percentage, comma style)
  • Layout (Compact/Outline/Tabular)
  • Design (PivotTable Styles)
  • Remove/Show Grand Totals & Subtotals

Go to PivotTable Design and Layout options.


6. Modifying PivotTable Calculations (Repeated Topic)

This includes:

  • Changing the summary function
  • Calculating % of row, % of column, % of grand total
  • Showing difference from another value
  • Adding calculated fields (PivotTable Analyze → Fields → Calculated Field)

7. Drilling Down into PivotTable Data

To see detailed records:

  • Double-click any number in the PivotTable
    Excel creates a new sheet showing the underlying rows that make up that number.

8. Creating PivotCharts

  1. Click anywhere inside the PivotTable
  2. Go to Insert → PivotChart
  3. Choose chart type (Column, Line, Pie, Bar, etc.)

PivotCharts update automatically when the PivotTable is refreshed.


9. Filtering PivotTable Data

You can filter using:

  • Filters area (top-level filter)
  • Row/Column labels filter (label, value filters)
  • Search filters
  • Date filters

10. Filtering with the Slicer Tool

A Slicer is a visual filter button set.

To add a slicer:

  1. Click PivotTable
  2. Go to PivotTable Analyze → Insert Slicer
  3. Select fields for filtering
  4. Click the slicer buttons to filter instantly



Wednesday, November 12, 2025

OIC - Automating GL FBDI Import and Monitoring with OIC Callback, Twilio Email, and DataDog

Use Case Overview

The objective of this integration is to automate the General Ledger (GL) FBDI file import process from Oracle Object Storage into Oracle ERP Cloud using Oracle Integration Cloud (OIC).

The solution not only performs the end-to-end data load but also:

  • Monitors the import execution status through OIC callback,
  • Sends detailed status emails using Twilio Email Service, and
  • Logs any failed import details into DataDog for proactive monitoring and troubleshooting.

Note: The source system provides the CSV file in an FBDI-compatible format, so we do not recreate or generate the FBDI CSV file within the integration.


Integration Architecture Overview

The entire flow is structured into three orchestrated integrations:

1. Scheduler Integration

  • Runs on a predefined schedule (e.g., hourly or daily).
  • Triggers the Main Integration, which performs the file listing and orchestration.

2. Main Integration

  • Connects to Oracle Object Storage and lists available files in the input bucket (pending GL extracts).
  • For each file found, it:
    • Passes the file name and metadata to the Child Integration using a For-Each loop.
    • Ensures parallel or sequential invocation as per configuration (typically sequential for control).
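Listing only the pending files can be done with the Object Storage ListObjects operation using a prefix filter; the prefix below is an assumed input-folder value:

GET /n/{namespaceName}/b/{bucketName}/o?prefix=incoming/gl/&fields=name,size,timeCreated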

3. Child Integration

  • Responsible for the core business logic of importing the GL data into ERP.
  • Steps include:
    1. Download the source file from Object Storage.
    2. Generate a properties file (the metadata required for the import).
    3. Create a zip file combining the data file and the properties file; this forms the FBDI.zip.
    4. Invoke the ERP Bulk Import service (Load Interface File for Import) to upload the FBDI.zip.
    5. Archive the FBDI zip file to an internal OIC folder for support reference.
    6. Rename the processed file with a Processed_ prefix.
    7. On any error, log the failure to DataDog from the global fault handler.



Properties file map: 
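(The mapping itself is shown as a screenshot in the original post. As a rough, hypothetical illustration only: for GL Journal Import the properties file is typically a single comma-separated line naming the ESS job package, the job name, and its parameters; the parameter list and order depend on your job definition, and the values below are placeholders.)

oracle/apps/ess/financials/generalLedger/programs/common,JournalImportLauncher,<DataAccessSetId>,<SourceName>,<LedgerId>,<GroupId>,N,N,N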

ERP bulk import:




Callback Integration – Monitoring and Notification

The callback performs the following major actions:

Receive business events:

Once ERP finishes processing the submitted FBDI file, regardless of success or failure, it raises an ERP Integration Inbound Event. We subscribe to this event and apply a filter condition so that only the corresponding GL import event is processed.

Assign required details

Assign the required event details to variables so they can be used in later steps.


Implement a Switch action to differentiate between Success, Load Error, and Import Error scenarios, and store the corresponding Process Request ID for each case.



Download the ESS job execution details

  • Use the Request ID stored in the previous step to download the logs zip file.
  • Unzip the execution details.
  • List all the execution files (.log, .txt, etc.).





Call the Twilio common service for email:

Add three repeat nodes, one each for the Success, Load Error, and Import Error scenarios:

Success Scenario

If the status = SUCCEEDED:

  • Downloads the .txt file generated from the “Import Journal: Child” process.
    Example: 3503927.txt
  • Uses Twilio Email Service to send the success email with the attached .txt file to the mail distribution list.

Error Scenario

If the summary status is not SUCCEEDED:

  • The integration downloads the relevant log and data files to capture the failure point.

Import Error

  • Attaches the .txt file from the “Import Journal: Child” process.
    Example: 3503927.txt

Load File Error

  • Attaches:
    • The .csv file from “Load Interface File for Import”
      Example: GLExtract_20250825.csv
    • The .log file from “Load File to Interface”
      Example: 3503909.log

All these files are attached and sent via Twilio Email Service to the mail group.


4. DataDog Logging

For every error or warning in the process:

  • The integration pushes a structured JSON log entry into DataDog containing:
    • File name
    • ERP Request ID
    • ESS job name and status
    • Timestamp and failure message
  • This enables real-time visibility into OIC integration health and ERP job-level outcomes.
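A minimal sketch of such a log entry, assuming the DataDog Logs intake API v2 and illustrative attribute names (the file name and request ID reuse the examples from earlier in this post):

POST https://http-intake.logs.datadoghq.com/api/v2/logs
DD-API-KEY: <api key>
Content-Type: application/json

[
  {
    "ddsource": "oic",
    "service": "gl-fbdi-import",
    "status": "error",
    "message": "GL FBDI import failed",
    "fileName": "GLExtract_20250825.csv",
    "erpRequestId": "3503909",
    "essJobName": "Load Interface File for Import",
    "essJobStatus": "ERROR",
    "timestamp": "2025-08-25T10:15:00Z"
  }
]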



End-to-End Flow Summary

  1. Scheduler – Triggers the Main Integration periodically.
  2. Main Integration – Lists files in Object Storage and invokes the Child Integration for each file.
  3. Child Integration – Downloads, zips, and imports the file into ERP.
  4. ERP Bulk Import – Executes the “Load Interface File for Import” and “Import Journal: Child” jobs.
  5. Callback Integration – Monitors job execution, downloads logs, and triggers the Twilio email.
  6. DataDog – Receives failure logs for monitoring and alerting.

Conclusion

This automated design using OIC, Twilio, and DataDog ensures:

  • A fully hands-free, monitored GL import process,
  • Automatic email alerts with detailed attachments,
  • Real-time visibility into both success and failure events,
  • And zero manual follow-up for ERP job status.

By modularizing the integration into scheduler, main, child, and callback components, the process becomes reliable, reusable, and easy to maintain.

Reference:

OIC - How to Send Email Using Twilio SendGrid API in Oracle Integration Cloud (OIC)

Sunday, November 9, 2025

Microsoft Excel - Importing and exporting data

1. Importing Data into Microsoft Excel

Excel allows you to import data from various sources such as text files, databases, and the web.
Steps:

  1. Open Excel and go to the Data tab.
  2. Click Get Data → choose the source (e.g., From File, From Database, From Web).
  3. Browse and select your file or connection.
  4. Preview the data and click Load to import it into Excel.

2. Importing Data from Text Files

You can import .txt or .csv files containing delimited data (commas, tabs, etc.).
Steps:

  1. Go to Data → Get Data → From File → From Text/CSV.
  2. Choose the file and click Import.
  3. In the preview window, choose the correct delimiter (comma, tab, etc.).
  4. Click Load or Transform Data to edit before loading.









3. Importing Data from Microsoft Access

Use this option to connect Excel with an Access database and pull tables or queries.
Steps:

  1. Go to Data → Get Data → From Database → From Microsoft Access Database.
  2. Select the Access file (.accdb or .mdb).
  3. Pick the table or query you want to import.
  4. Click Load to bring the data into Excel.







4. (New Version) Import Data from Text Files into Excel

The latest Excel versions (Office 365, Excel 2021+) use Power Query for importing.
Steps:

  1. Go to Data → Get Data → From File → From Text/CSV.
  2. Power Query opens automatically — preview and format data.
  3. Choose Load to sheet or Load to Data Model for advanced analysis.






5. (New Version) Import Data from a Database into Excel

You can import directly from SQL Server, Oracle, or other supported databases.
Steps:

  1. Go to Data → Get Data → From Database.
  2. Select the database type (e.g., SQL Server).
  3. Enter the server name, database, and credentials.
  4. Select the tables or views to import, then click Load.



6. Microsoft Excel Legacy Import Options for New Excel Versions

Newer Excel versions still support legacy wizards for backward compatibility.
Steps:

  1. Go to Data → Get Data → Legacy Wizards → From Text (Legacy).
  2. Choose your file and follow the step-by-step import wizard.
  3. Use this when older formatting or delimiters don’t work properly in Power Query.




7. Exporting Data to a Text File

You can export your Excel worksheet data as a .txt or .csv file for use in other applications.
Steps:

  1. Go to File → Save As.
  2. Choose a location and select Save as type → CSV (Comma delimited) or Text (Tab delimited).
  3. Click Save.
  4. Excel will warn that only the active sheet is saved — click OK to confirm.



