Use case:
We have a BICC extract that stores the data as a .csv.gz file in Object Storage; ODI then picks the file up and inserts it into an ADW temporary table, but the load is failing because the file contains special characters. Here we will see the steps to find and remove the special characters and then process the file.
Error details:
Solution steps:
Step1) From the failed scenario >> open session >> Steps >> expand all >> right click on "Insert New Rows Custom IKM Oracle Control Append" >> open session task >> Code >> note the C$ external table name from the "Fetch C$ table from" section
Step2) Run the below command in the database to validate the external table and capture the bad-record details (use the C$ external table name noted in Step1):
BEGIN
  DBMS_CLOUD.VALIDATE_EXTERNAL_TABLE(
    table_name => 'C$_0Ansnsmssksk22_EXT');
END;
/
Step3) Check the below tables to get the bad-record details. The numeric suffix (62 here) changes with every validation run; the actual log/bad table names are reported in the USER_LOAD_OPERATIONS view:
SELECT * FROM VALIDATE$62_LOG;
SELECT * FROM VALIDATE$62_BAD;
Step4) Copy the bad-record details into Notepad++ to see the bad characters:
https://soalicious.blogspot.com/2024/07/notpad-how-to-find-special-characters.html
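As an alternative to Notepad++, the special characters can also be located directly on the server via the bucket mount described in Step5. A minimal sketch, assuming GNU grep with -P support; the file name is illustrative:
##Show lines containing bytes outside printable ASCII, straight from the .csv.gz file
zcat file_name.csv.gz | LC_ALL=C grep -n --color=auto -P '[^\x20-\x7E]' | head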
Step5) If a bad record or special character exists, remove it by following the below steps:
##Go to the object storage bucket path via the terminal. Here the OCI Object Storage bucket is mounted to the Unix/Linux server.
cd /home/oracle/bucket-bicc-prod
##To see the .csv.gz file
ls -ltr
##Take a backup of the file
cp file_name.csv.gz ../ODI_Backup
##Unzip file
gzip -d file_name.csv.gz
##Open the file using vi, type "/" followed by a few words from the bad record, and press Enter to search
The matching places will be shown highlighted.
Press Esc, then i to enter insert mode, and remove the special characters (they can be hidden, e.g. showing up as a double space).
Save and exit with :wq
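If the offending bytes are consistent across the file, the interactive vi edit can also be done non-interactively. A minimal sketch, assuming GNU sed is available and that the bad bytes are UTF-8 non-breaking spaces (0xC2 0xA0); adjust the byte pattern to whatever the BAD table or Notepad++ showed:
##Replace the offending bytes with a normal space, editing the file in place (take the backup first, as above)
LC_ALL=C sed -i 's/\xc2\xa0/ /g' file_name.csv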
##Zip the file again
gzip file_name.csv
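Optionally, confirm the re-created archive is readable before re-running the load; gzip prints nothing when the file is intact:
##Test the integrity of the re-zipped file
gzip -t file_name.csv.gz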
##Re-run the insert code with the load id
Once it is successful, commit and verify that the table row count matches the file line count with the below command:
zcat file_name.csv.gz | wc -l
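Note that wc -l counts every line. If the extract file includes a header row (this depends on how the BICC extract is configured), exclude it when comparing against the table row count. A sketch assuming the first line is a header:
##Count data rows only, skipping the first line
zcat file_name.csv.gz | tail -n +2 | wc -l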
##Move the .csv.gz file to the archive folder
mv file_name.csv.gz ../bucket-archive-prod
Step6) Mark the failed scenario as success and restart the load plan.