Tuesday, September 17, 2024

OIC Gen3 - using Recipes and Accelerators

Recipes and accelerators, collectively known as prebuilt integrations, are preassembled integration solutions.

A recipe or accelerator contains all the resources required for a specific integration scenario. The resources include integration flows, connections, lookups, and certificates. Use a recipe or accelerator to quickly get started building an integration.

Recipe types:

Recipes are either project-based or package-based:

  • When you install a project-based recipe, you can access it on the Projects page in Oracle Integration.
  • When you install a package-based recipe, you can access it on the Packages page in Oracle Integration.

How to determine whether a recipe in the Integration Store is project-based or package-based:

Before you install it, hover over the recipe card and click Open Details to expand the information pane. The recipe details show either Project code (for a project-based recipe) or Package name (for a package-based recipe).

Differences Between Recipes and Accelerators

Recipes are sample use cases that give you a head start. Accelerators are run-ready business integrations or technical patterns of larger scale.

Here's a comparison of recipes and accelerators.

Recipes:

  • A recipe is a sample use case that gives you a head start.
  • Not supported by the producer.
  • Fully editable in the Oracle Integration designer.
  • Can't auto-upgrade to new versions.
  • Configurator in Oracle Integration.
  • Always free.

Accelerators:

  • A business accelerator provides an end-to-end business process or use case (for example, marketing to lead, hire to retire, or concept to launch).
  • A technical accelerator provides a common technical solution (for example, sending alerts on failures); technical accelerators are meant to be called by another integration.
  • Managed and supported by the producer.
  • Configurable and extendable.
  • Upgrades provided by the producer.
  • Configurator in Oracle Integration and as native SaaS.
  • Paid offering (as decided by the producer).

How to find a recipe or accelerator:

  1. On the Oracle Integration Home page, in the Get started section, click Browse store.
    The Integration Store is displayed. Note that you can toggle the display between a list view and a card view.
  2. Use the Search, Filter, and view tools to narrow your search, filter and sort the list, and change how the list is displayed.
How to install a recipe or accelerator:
  1. Find the recipe or accelerator that you want to install.
  2. Hover over the recipe or accelerator card and click Install Install icon.
    When you install a recipe (or an accelerator), it's installed as a package or project. 

Note:

For project-based accelerators, Oracle periodically releases updates to the Integration Store. You can upgrade an installed accelerator project to a newer version automatically without making manual changes to your existing installation. 

Accelerator extensions and versions:

How to extend an accelerator:

  1. First install the accelerator: go to the Oracle Integration Home page >> View all >> select the version and install.
  2. Once installed, for projects, click the actions ellipsis (...) of the integration and click Extend.
  3. Add an extension group before or after an invoke; for example, add a Data Stitch to enrich the data sent to the invoke, or add fault handling logic after the invoke.
  4. Save and activate.



How to upgrade an accelerator version:

Suppose an accelerator is already installed with extension logic added, and a new upgrade is released. We can upgrade the version as follows:

  1. Go to the Oracle Integration Home page >> view all accelerators >> select the accelerator to upgrade, select the latest version, and install.
  2. Once installed, for projects, click the ellipsis of the integration and click Refresh endpoints.
  3. Then merge the extension changes of the previous version into the new version: click the actions ellipsis of the integration >> click Extend >> select the previous version to merge into this new version.
  4. Save it.






Reference

https://docs.oracle.com/en/cloud/paas/application-integration/int-get-started/get-started-integration-recipes-and-accelerators.html

OIC ERP/HCM - How to view or download a file from UCM for support

Use case: Here we will see how to check or download a file from UCM for support help, verification, or testing purposes.

Steps:

Step 1: Go to the content server at https://<host instance>/cs and click Search.

Step 2: Search for the file by content ID and click the info icon.

Step 3: Click the file link below it; the file downloads automatically.

The above steps let us fetch a file from UCM and check it for support or dev-testing verification.

OIC - How to execute a scheduled job in OIC that runs only on weekdays, excluding public holidays

To execute a scheduled job in Oracle Integration Cloud (OIC) that runs only on weekdays (Monday to Friday), excluding public holidays, we can follow these steps:

1. Set Up the Weekday Recurrence in OIC:

We can use a recurrence pattern to run the job only on weekdays. Choose the iCal format, set the frequency to DAILY, and restrict it to Monday through Friday using the BYDAY parameter.

For example:

FREQ=DAILY;BYDAY=MO,TU,WE,TH,FR

This ensures the job only runs Monday through Friday.

2. Exclude Public Holidays:

OIC doesn't provide a built-in way to exclude public holidays automatically. You'll need to handle this manually or programmatically. Here are some approaches:

Option 1: Manually Disable the Job for Public Holidays

Create a list of public holidays and manually disable the schedule on those days. You could either pause the integration the day before the holiday, or modify the recurrence rules around those dates.

Option 2: Use a Lookup Table for Public Holidays

Create a lookup table that contains the list of public holidays. Modify the integration to check the current date against this list at runtime. If the current date is a public holiday, the job won't proceed further.

Option 3: Programmatically Handle Holidays in the Integration Flow

In the integration logic, before running the core job, add a step that checks whether the current date is a public holiday (by querying a list of holidays stored in a file or a database). If it is, skip the execution.

You can combine options 1 and 3 for flexibility: manually manage the public holidays for the current year, and set up a query mechanism for holidays that change each year.
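The runtime check in options 2 and 3 boils down to a small piece of date logic. Here is a minimal sketch in Python; the holiday list and function name are illustrative only, since in OIC the holidays would live in a lookup table or configuration file:

```python
from datetime import date

# Hypothetical list of public holidays; in OIC this would come from a
# lookup table or a config file rather than being hard-coded.
PUBLIC_HOLIDAYS = {
    date(2024, 12, 25),  # Christmas Day
    date(2025, 1, 1),    # New Year's Day
}

def should_run(run_date: date) -> bool:
    """Run only on weekdays that are not public holidays."""
    is_weekday = run_date.weekday() < 5  # Mon=0 .. Fri=4
    return is_weekday and run_date not in PUBLIC_HOLIDAYS

print(should_run(date(2024, 12, 27)))  # Friday, not a holiday -> True
print(should_run(date(2024, 12, 25)))  # Wednesday, but a holiday -> False
print(should_run(date(2024, 12, 28)))  # Saturday -> False
```

In an OIC flow, the equivalent would be a switch action right after the schedule trigger that routes to a stop/end when the lookup marks the current date as a holiday.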

Monday, September 16, 2024

OIC gen3 - Integration patterns and Define schedules

Integration patterns:

  1. Application: real-time call or data feed from the subscriber.
  2. Schedule: runs at specific dates and times defined in a schedule; mainly used for bulk or batch integration or file processing.
  3. Event: starts when an event is published. It follows an event-driven architecture (EDN) or publish/subscribe model.
Creating a schedule:
There are two ways to create or define a schedule for a scheduled integration:
  1. On the design canvas, click the ellipsis on the schedule node and edit the schedule definition.
  2. Add a schedule on the Integrations page, before or after the integration has been activated.
Schedule types:
  1. Simple schedule
    • We can define a run-once schedule: hours, minutes, days, weeks, months.
    • We can also define a recurring schedule: minutely, hourly, daily, weekly, monthly.
    • Frequency cannot be less than 10 minutes.
    • We can set a specific time zone, add a time window to run in, or keep the default never-expire mode.
  2. Advanced schedule (iCal expression)
    • We use a calendar (iCal) expression.
    • Frequency can be set to less than 10 minutes.
    • We can also combine multiple schedule frequencies using &.
Examples:

Example 1: This example runs on the 1st, 10th, and 15th days of the month at 5:15 AM, 10:15 AM, 3:15 PM, and 8:15 PM.

FREQ=MONTHLY;
BYMONTHDAY=1,10,15;
BYHOUR=5,10,15,20;BYMINUTE=15;

Example 2: Multiple schedule frequencies combined. This example runs every day between 5:30 PM and 7:30 PM, executing every 10 minutes during these hours.

FREQ=DAILY;BYHOUR=17;BYMINUTE=30,40,50;BYSECOND=0;
&FREQ=DAILY;BYHOUR=18;BYMINUTE=0,10,20,30,40,50;BYSECOND=0;
&FREQ=DAILY;BYHOUR=19;BYMINUTE=0,10,20,30;BYSECOND=0;
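As a sanity check, Example 1's rule can be expanded by hand: 3 matching days times 4 matching hours, at minute 15, yields 12 runs per month. A quick sketch:

```python
from itertools import product

# Expand Example 1's iCal rule for one month by hand:
# BYMONTHDAY=1,10,15; BYHOUR=5,10,15,20; BYMINUTE=15
days, hours, minute = (1, 10, 15), (5, 10, 15, 20), 15
runs = [(d, h, minute) for d, h in product(days, hours)]
print(len(runs))  # 12 scheduled runs per month
```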


Note: we can only start the schedule after the integration is activated. Once started, we can see the future runs of the schedule.

OIC Gen3 - About Tracing Levels while activating integration

OIC has the following 3 tracing levels:

  1. Production
  2. Audit
  3. Debug
Production:
  • Data is retained for 32 days.
  • Logs only invoke/logger actions.
  • Within loops, logs up to 1,000 iterations.
Audit:
  • Data is retained for 8 days.
  • Logs the same as the Production option.
  • Also logs external payloads.
Debug:
  • Data is retained for only 24 hours.
  • Logs the same as the Audit option.
  • Also logs all actions within loops.

Can we modify the tracing level dynamically?

Yes, we can modify the tracing level at runtime without deactivating the integration.
Steps:
Integration >> "..." actions >> Configure activation >> modify the tracing level.

What happens when we deactivate an integration:
  • Activated integrations cannot be edited.
  • The integration stops processing any new messages.
  • Pending requests remain unprocessed and are lost.
  • Existing history, monitoring, and runtime data are lost.
  • In-flight instances will fail, which can be observed.
  • You must stop the schedule before deactivation.

OIC Gen3 - Integration Lifecycle

 



Sunday, September 15, 2024

OIC Gen3 - Working with Projects | Packages vs Projects

Differences between packages and projects:

  • Overall usage and goals: Packages provide organization of integrations, allowing import and export of related resources. In addition to resource organization, projects provide improved release management and a single unified workspace.
  • Access control: Resources within packages are visible to all OIC users, the same as all other global resources. Projects provide fine-grained access control.
  • Deployment: Creating a CI/CD pipeline to deliver updated integrations to other environments requires work outside of OIC. Projects provide built-in deployment capabilities with release management and controlled deployments.
  • Observability: Packages do not provide separate monitoring capabilities. Projects provide internal Observe pages for monitoring activated project integrations.


Convert a package to a project:

You heard it right: we can convert a package to a project. However, accelerator and recipe packages are not supported for conversion.

Notes to remember about projects:
  1. A project lets you view and work on integrations, connections, lookups, and libraries on one page.
  2. A project has 3 sections: Design, Deploy, and Observe.
  3. There is a "share" option from which we can give others access to the project. By default, a project and its resources are private and only the owner has access.
  4. From Design, we can design our integrations, create lookups, connections, etc., and run or test the integrations.
  5. From Observe, we can view the integrations, instances, future runs, and audit info.
  6. From Deploy, we can create a deployment of the project and select which of the project's integrations are part of it. After creating the deployment, we can export it and import it into a higher instance.
  7. For packages, export creates a .par file; for projects, export creates a .car file.

OIC Gen 3 links

OIC Generation 3 links:

  1. Oracle Integration Generation 3 New Features
  2. Integration patterns and how to define schedules
  3. About RBAC - Resource based access control
  4. How to publish and subscribe events | What is oracle integration events | what is publish event action
  5. Read in segments - set your own chunk or segment size
  6. About different Actions available in OIC Generation 3
  7. Parallel action
  8. Stage file action
  9. File Server Action
  10. File adapter vs FTP adapter
  11. FTP does not support polling as trigger then what is the alternate solution to achieve FTP Polling?
  12. Max file size supported by Rest, soap, file and ftp and database adapters | Service limit
  13. Data Stitch action
  14. OIC Error Handling
  15. OIC gen3 - Working with XSLT constructors
  16. Working with Projects | Packages vs Projects
  17. About Tracing levels while activating integration
  18. TBD







OIC Gen3 - Stage file action

The stage, also called the VFS (virtual file system), is a temporary location in the OIC local file system that stores temporary files required for processing in an OIC instance. It provides a file reference which we can use to access the data or file content.

Stage file action operations:

  1. List Files
  2. Unzip file
  3. Zip file
  4. Decrypt file
  5. Encrypt file
  6. Write file
  7. Read entire file
  8. Read file in segments
Points to remember:
  1. To decrypt, use the private PGP key; to encrypt, use the public PGP key. First we need to upload the PGP keys in the OIC Certificates section so that we can use them later in the stage action.
  2. For the read/write file schema configuration options, EDI document is also available along with CSV, JSON, XSD, and XML document (single or no namespace).
  3. Writing a file with a schema option supports a max file size of 10 MB.
  4. To write a file reference (opaque file), we have to use an opaque XSD as the sample file, and in the mapper use the encodeReferenceToBase64() function to convert from a reference to base64 content.
  5. OIC automatically handles the creation, deletion, and cleanup of the temporary files in the VFS (local file system). When the instance completes, the files in the VFS are flushed out.
  6. "Read file in segments" supports a minimum segment size of 200 and a maximum segment size of 2,000.
  7. "Read file in segments" does not support a JSON sample for the schema options.
  8. With "read file in segments", by default the rows are processed in parallel. To process sequentially, select the check box for sequential processing.
  9. With "read file in segments", remove trailer is supported only for CSV files. We can remove the "last n rows" or "last row" of the file.
  10. The stage can store files up to a max size of 1 GB.
  11. Whatever CSV, JSON, or XML file or data we try to read or write is internally translated to an XML structure; that's why, if we open the mapper, we see it as XML at both the source and target ends.
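Point 4 relies on base64 to carry binary content inside an XML payload. The encoding step behind encodeReferenceToBase64() can be illustrated with plain Python (the sample data is made up):

```python
import base64

# A file's raw bytes become a base64 string that can travel safely
# inside an XML payload (what the opaque XSD's element carries).
data = b"id,name\n1,alpha\n2,beta\n"
b64 = base64.b64encode(data).decode("ascii")

# Decoding restores the original bytes exactly.
print(base64.b64decode(b64) == data)  # True
```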


OIC Gen3 - File Server action

FTP adapter vs File Server action:

  • Using the File Server action, we can access the OIC embedded file server (though we can also connect to it using the FTP adapter).
  • Whereas the FTP adapter uses the sFTP protocol and user credentials to connect to the OIC file server, the File Server action uses internal HTTP/TLS APIs and requires no user credentials.
  • The File Server action is displayed in the action palette only if the file server has been enabled from the OCI Console.

When to use the FTP adapter:

  • Read a file into the VFS for further processing.
  • Encrypt or decrypt a file.
  • Sign or verify a file.

The File Server action supports the following five operations:

  1. List Directory
  2. Get file reference
  3. Write file
  4. Move file
  5. Delete file


Notes

  • For List Directory, a maximum of 1,000 files is supported.
  • When we write, move, or delete files using the File Server action, a boolean success field in the response indicates whether the operation completed successfully.

OIC Gen 3 - File adapter vs FTP adapter




OIC Gen 3 - FTP does not support polling as trigger then what is the alternate solution to achieve FTP Polling?

What: FTP Adapter does not support polling as a trigger.

Alternate solution: follow the steps below.

  1. Create a scheduled integration.
  2. Use List Files as an invoke.
  3. If the list count > 0:
  4. Download the files to the stage.
  5. Process the files to downstream apps.


Note: The File adapter supports polling for files as a trigger against a shared file system.
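The five steps above amount to a scheduled "pseudo-polling" loop. A rough sketch with the FTP calls stubbed out — list_files and process_file are placeholders for the adapter's List Files operation and the download/downstream steps, not real OIC APIs:

```python
# One scheduled run of the pseudo-polling pattern: list the remote
# directory, and process any files matching the expected pattern.
def poll_once(list_files, process_file, suffix=".csv"):
    candidates = [name for name in list_files() if name.endswith(suffix)]
    for name in candidates:
        process_file(name)   # download to stage, then hand downstream
    return len(candidates)   # 0 means nothing to do in this run

# Example run with stubbed-out I/O:
processed = []
count = poll_once(lambda: ["a.csv", "b.txt", "c.csv"], processed.append)
print(count, processed)  # 2 ['a.csv', 'c.csv']
```

In OIC, the schedule frequency plays the role of the polling interval, and the "if list count > 0" switch maps to the filter above.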

Saturday, September 14, 2024

OIC Gen 3 - Max file size supported by Rest, soap, file and ftp and database adapters | Service limit

The max file sizes supported by the REST, SOAP, FTP, File, and Database adapters are as follows:

REST Adapter:

  • Structured message for trigger and invoke: 100 MB
  • Binary or attachments: 1 GB
  • With connectivity agents: 50 MB

SOAP Adapter:

  • Structured message for trigger and invoke: 100 MB
  • Binary or MTOM attachments: 1 GB
  • With connectivity agents: 50 MB

FTP Adapter:

  • Structured message for trigger and invoke: 100 MB
  • Binary or without schema: 1 GB
  • With schema and connectivity agents: 50 MB

File Adapter:

  • Structured message for trigger and invoke: 50 MB
  • Binary or attachments: 1 GB
  • With connectivity agents: 50 MB

Database Adapter:

  • Trigger or polling (schema-based): 50 MB
  • Invoke database select: 100 MB
  • Invoke stored procedure / operation on table / run pure SQL operation with schema transformation: 10 MB
  • With connectivity agents: 50 MB
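These limits are easier to respect with a pre-flight check before the invoke than to discover as a runtime failure. A hypothetical guard (the limit table and names below simply mirror a few of the numbers above; they are not an OIC API):

```python
# Illustrative size guard mirroring some of the service limits above.
LIMIT_MB = {
    "rest_structured": 100,   # REST structured message
    "rest_attachment": 1024,  # REST binary/attachments (1 GB)
    "with_agent": 50,         # any adapter via connectivity agent
}

def within_limit(size_bytes: int, kind: str) -> bool:
    """True if a payload of size_bytes fits the given limit category."""
    return size_bytes <= LIMIT_MB[kind] * 1024 * 1024

print(within_limit(99 * 1024 * 1024, "rest_structured"))  # True
print(within_limit(51 * 1024 * 1024, "with_agent"))       # False
```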

Reference:

https://docs.oracle.com/en/cloud/paas/application-integration/oracle-integration-oci/service-limits.html

Friday, September 13, 2024

OIC Gen 3 - Data Stitch action

Data Stitch operation types:

  1. Assign: suppose you use an invoke inside a scope and want to take the invoke variable's data outside of the scope; that is mainly where we use Assign.
  2. Append: suppose we have a scenario where we need to incrementally build a message payload or do a partial update of the payload.
  3. Remove: to remove some data from an existing payload.


Assign:

OIC - Global variable and Data Stitch action using assign operation

Append:

OIC - Append operation in Data Stitch action | Incrementally build a message payload using Data Stitch

Remove:

TBD
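The three operations can be pictured as simple edits on a payload tree. A rough Python analogy (the payload shape is made up for illustration):

```python
# A payload with a repeating "lines" element, as a nested dict.
payload = {"order": {"id": 1, "lines": []}}

# Assign: set or overwrite a single value in the payload.
payload["order"]["status"] = "NEW"

# Append: incrementally build a repeating element, one row at a time.
payload["order"]["lines"].append({"sku": "A-1", "qty": 2})
payload["order"]["lines"].append({"sku": "B-2", "qty": 1})

# Remove: drop part of the payload.
del payload["order"]["status"]

print(len(payload["order"]["lines"]), "status" in payload["order"])  # 2 False
```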


Thursday, September 12, 2024

OIC Gen3 - about actions

How to add actions:

  1. Open the Actions pane.
  2. Use the inline menu.
List of actions available:



OIC Gen 3 - Working with XSLT constructors

If we open the Toggle functions palette, by default we will not see the XSLT constructors; we have to click the XSLT button to make them visible.



If we drag an XSL statement and hover over or drop it in front of the element, it is added as a child component of the element.



If we drag an XSL statement and hover over or drop it behind the element name, it is added as a parent component of the element.



Note:

  • Select the if statement to specify a single condition.
  • Use choose/when/otherwise to specify multiple conditions.
  • For looping logic, use the for-each statement.
  • We can also use the XSLT output constructors (literal, text, attribute, copy-of, and value-of) to set default values in the mapper.
  • Using the copy-of constructor, we can perform a deep copy of elements from source to target if both have the same set of elements.


XSLT code editor use cases:
  • Create internal variables.
  • Correlate multiple data sources grouped by a key field using the xsl:for-each-group constructor.
  • Create target name-value pairs.
  • Implement push-style XSLT (call-template, apply-templates).
  • Write custom XSLT functions.
  • Copy node sets.

Edit XSLT code in Oracle JDeveloper:
  • Export the code:
    • Design the integration flow logic.
    • Open the empty map action.
    • Map one data value from each required structure to the target.
    • Validate and close the mapper.
    • Save and then close the integration.
    • Export the entire integration archive (.iar file); for projects, export the .car file.
  • Import the .iar file into Oracle JDeveloper:
    • Create an OSB application and project.
    • Import the integration archive into the OSB project (Service Bus Resources >> select zipped or archived files >> select file >> import).
  • Locate and open the .xsl file to edit it.
  • Import the .xsl file (not the entire integration) back into the OIC mapper.

OIC Gen 3 - Error handling | OIC Error Hospital | Global and scope level fault handlers | Fault and End actions

Why we need to implement error handling / designing beyond the happy path:

  • What happens when we don't implement fault handling logic of any kind? By default, every error that occurs at runtime is delivered to the OIC error hospital, which is part of the OIC runtime environment hosting all your deployed integrations. This includes faults such as runtime or business faults, request timeouts, invalid payloads, internal errors, etc.
  • When the error hospital catches a fault, the integration flow stops and is terminated immediately.
  • If you work with projects, the fault shows on the Observe page.
  • If the integration is a synchronous, application-driven one and a fault happens, the error hospital sends the same fault details back to the client.
  • All the errors are visible in the Observability section >> Errors page.

Instead of allowing the error hospital to catch every fault, we can intentionally catch all faults using the global and scope fault handlers.

Best-practice examples of error handling:

  1. Log the error but continue with the integration flow.
  2. Invoke a secondary service for backup processing.
  3. Log the error and then terminate the integration flow.
  4. Invoke another service for notification or error-handling processing.
  5. Reply to the integration flow's client with a custom error response.
  6. Send an email notification to an external stakeholder or an internal administrator.
  7. Invoke an OIC process to initiate a process workflow involving manual intervention.

Implementing the global fault handler:

  1. By default, there is a default global fault handler containing a re-throw fault action, which rethrows the fault to the OIC error hospital. Faults reach it in three ways:
    1. All uncaught faults and errors go to the global fault handler.
    2. A throw new fault action in a scope fault handler.
    3. A re-throw fault action in a scope fault handler.
  2. Change the default handling logic to suit your integration, for example:
    1. Add a logger.
    2. Add an email notification.
    3. Map custom data for a fault return (for sync integrations).
    4. Mitigate the error condition with alternative logic.
    5. Invoke an error-handling service (another OIC integration or an external service).

Global fault handler end actions:

  1. Re-throw fault: sends the fault to the error hospital.
  2. Throw new fault: sends custom fault details to the error hospital.
  3. Stop: terminates the integration flow; no error hospital is involved. Separate fault handling logic will sit before the stop action.
  4. Fault return: explicitly sends business faults to the client of a sync service. No error hospital is involved.
  5. Return: used for fault mitigation logic, e.g., invoking a backup service and then sending the normal data to the client. No error hospital is involved.

For each invoke, try to create a scope and define a scope fault handler.


Faults occurring within handlers are caught by the next higher handler.

Inner scope >> outer scope >> global >> error hospital
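This propagation order behaves like nested try/except blocks, where each handler can re-throw to the next level out. A small sketch of the analogy:

```python
# A fault unhandled (re-thrown) by the inner scope is caught by the
# outer scope, then by the global handler, and only then would it
# reach the error hospital.
order = []

def run():
    try:                      # outer scope
        try:                  # inner scope
            raise RuntimeError("invoke failed")
        except RuntimeError:
            order.append("inner")
            raise             # re-throw to the outer scope handler
    except RuntimeError:
        order.append("outer")
        raise                 # re-throw to the global handler

try:                          # global fault handler
    run()
except RuntimeError:
    order.append("global")

print(order)  # ['inner', 'outer', 'global']
```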


Scope or global fault objects are created to fetch fault details inside the fault handler.


Note:

For my project, I have followed the approach below:

  1. For each invoke, place it in a scope; in the default fault handler, throw a new fault with code, reason, and details.
  2. In the global fault handler, we send the global fault details to the Datadog SaaS application for further support.
  3. We have created a lookup like common_error_details_lookup that captures the key, error type, error code, reason, and details. Using a key like 'UCMUPLOADFAILED', we can fetch the custom error details.
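The lookup in point 3 is essentially a key-to-fault-details map. A sketch of the idea; the keys, codes, and wording below are illustrative, not the actual lookup entries:

```python
# Illustrative stand-in for a common_error_details_lookup table.
ERROR_LOOKUP = {
    "UCMUPLOADFAILED": {
        "errorType": "Business",
        "errorCode": "ERR-1001",
        "reason": "UCM upload failed",
        "details": "File could not be uploaded to the content server",
    },
}

def fault_details(key):
    """Fetch custom fault details by key, with a generic fallback."""
    return ERROR_LOOKUP.get(key, {"errorCode": "ERR-9999",
                                  "reason": "Unknown error"})

print(fault_details("UCMUPLOADFAILED")["errorCode"])  # ERR-1001
print(fault_details("NOSUCHKEY")["errorCode"])        # ERR-9999
```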


