Getting SQL Loader summary results from its log file (Oracle SQL*Loader on 10g, 10.2.0, Windows XP and Windows 2003 Server)
Getting SQL Loader summary results from its log file [message #515872] Wed, 13 July 2011 22:42
rsager
Messages: 17
Registered: June 2011
Location: Sydney, Australia
Junior Member
Hi folks...

Another curly one....

When SQL*Loader runs and loads the data from a
csv file correctly, it creates a log file and a bad file as output. From past postings I created, and the valuable replies I received, I got all of this working, in that the csv data loads into the appropriate Oracle 10g table.

These incoming csv files differ from one another, and it is going to be a job to figure out the best way to load the data: use PL/SQL (as I did to load dates in different formats into the Oracle table in one consistent format), call SQL*Plus to manipulate the data, and so on. Some of these csv files have total summary lines at the end of the file, while others do not. Therefore, I was thinking of reading the SQL*Loader log file produced by a previous execution to retrieve the totals of logical records skipped, read, rejected and discarded.

So, taking previous suggestions to use external tables (conceptually), I started down that line but got serious ORA errors. SQL*Loader runs from the same Windows network drive, retrieves the csv file (from a network drive) successfully, and writes the log and bad files back to the appropriate network drive directory
'J:\Grower\Round_Modules\Crop2011\' in subdirectories.

So...what I have done...

I searched the forum and found an old posting >

"sql loader log file result in table? (merged) [message #270469] "

As the SYS user using SQL*Plus I...
================================
SQL> conn sys/******@APPSV2PTLIVE10 as sysdba
Connected.
SQL> CREATE DIRECTORY ext_tab_dir AS 'J:\Grower\Round_Modules\Crop2011\Logs' ;

Directory created.

SQL> GRANT READ ON DIRECTORY ext_tab_dir TO ptlive;

Grant succeeded
================================


As the PTLIVE user using SQL*Plus I...
================================
SQL> conn ptlive/******@APPSV2PTLIVE10
Connected.

SQL> create table load_log (line varchar2(2000))
2 organization external (
3 type oracle_loader
4 default directory ext_tab_dir
5 access parameters (
6 records delimited by newline
7 nobadfile
8 nologfile
9 nodiscardfile
10 fields terminated by '<roger>'
11 missing field values are null
12 (line)
13 )
14 location ('Batch_2011Jul12_113124_7620.log')
15 )
16 reject limit unlimited;

Table created.


SQL> desc load_log
Name Null? Type
------------------------------------------------------------------------ -------- -----------------
LINE VARCHAR2(2000)

SQL> select * from load_log;
select * from load_log
*
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04040: file Batch_2011Jul12_113124_7620.log in EXT_TAB_DIR not found
================================

I then found an interesting note via Google:
http://forums.oracle.com/forums/thread.jspa?threadID=356054

However, I could not figure out what I was doing wrong. I even
thought it was a folder permissions issue, so I tried a directory
on my PC with full access for the user "everyone", recreated the
directory in SQL*Plus to point to my PC directory with full privileges, and reran the create table load_log, but the same ORA errors occur.

Any suggestions?

PS: I will upload the SQL*Loader log file.

Ideally I would have liked to load the results into a custom table
I have created:

CREATE TABLE PTLIVE.MODULE_CSV_SQLLOADER_SUMMARY
( IMPORT_CSV_FILE_NAME     VARCHAR2(256),
  TOTAL_RECORDS_SKIPPED    NUMBER(5),
  TOTAL_RECORDS_READ       NUMBER(5),
  TOTAL_RECORDS_REJECTED   NUMBER(5),
  TOTAL_RECORDS_DISCARDED  NUMBER(5),
  TOTAL_RECORDS_LOADED     NUMBER(5),
  DATEIMPORTED             DATE DEFAULT SYSDATE
)
TABLESPACE PTLIVE_DATA
PCTUSED  0
PCTFREE  10
INITRANS 1
MAXTRANS 255
STORAGE (
  INITIAL     5M
  NEXT        5M
  MINEXTENTS  1
  MAXEXTENTS  UNLIMITED
  PCTINCREASE 0
  BUFFER_POOL DEFAULT
)
LOGGING
NOCOMPRESS
NOCACHE
NOPARALLEL
MONITORING;

The relevant lines of the uploaded log data are:

Total logical records skipped: 1
Total logical records read: 527
Total logical records rejected: 0
Total logical records discarded: 13

These would load into the table above, with
IMPORT_CSV_FILE_NAME set to the constant "Batch_2011Jul12_113124_7620.csv".

However, I am willing to use this
MODULE_CSV_SQLLOADER_SUMMARY table as a working table, then use SQL to substring and load the desired results into the correct table.
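
For example, if the external table did come good, the substring step might look something like this (a sketch only, assuming the LOAD_LOG external table above and the "Total logical records ..." line format from the log):

INSERT INTO ptlive.module_csv_sqlloader_summary
       (import_csv_file_name,
        total_records_skipped,
        total_records_read,
        total_records_rejected,
        total_records_discarded)
SELECT 'Batch_2011Jul12_113124_7620.csv',
       -- one CASE per "Total logical records ..." line; the count follows the colon
       MAX(CASE WHEN line LIKE 'Total logical records skipped%'
                THEN TO_NUMBER(TRIM(SUBSTR(line, INSTR(line, ':') + 1))) END),
       MAX(CASE WHEN line LIKE 'Total logical records read%'
                THEN TO_NUMBER(TRIM(SUBSTR(line, INSTR(line, ':') + 1))) END),
       MAX(CASE WHEN line LIKE 'Total logical records rejected%'
                THEN TO_NUMBER(TRIM(SUBSTR(line, INSTR(line, ':') + 1))) END),
       MAX(CASE WHEN line LIKE 'Total logical records discarded%'
                THEN TO_NUMBER(TRIM(SUBSTR(line, INSTR(line, ':') + 1))) END)
FROM   load_log
WHERE  line LIKE 'Total logical records%';

TOTAL_RECORDS_LOADED could then be derived as read minus rejected minus discarded, and DATEIMPORTED falls back to its SYSDATE default.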

Cheers
Roger






Re: Getting SQL Loader summary results from its log file [message #515879 is a reply to message #515872] Thu, 14 July 2011 00:14
Littlefoot
Messages: 21806
Registered: June 2005
Location: Croatia, Europe
Senior Member
Account Moderator
Well, an external file's drawback* is that the file has to reside on the database server, in a directory you created and pointed the Oracle DIRECTORY object to. It means that "J:\Grower\Round_Modules\Crop2011\Logs" is supposed to be on the server, not your PC. I suspect that this might be the culprit.
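
A quick way to double-check is to ask the data dictionary where the DIRECTORY object actually points (ALL_DIRECTORIES lists the directories you have been granted access to):

SELECT directory_name, directory_path
FROM   all_directories
WHERE  directory_name = 'EXT_TAB_DIR';

Whatever path comes back is resolved on the database server's file system, so 'J:\Grower\Round_Modules\Crop2011\Logs' has to be visible to the Oracle server process, not just mapped on your client PC.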

_______________
*a drawback compared to SQL*Loader, which runs successfully on client PCs
Re: Getting SQL Loader summary results from its log file [message #516058 is a reply to message #515872] Thu, 14 July 2011 17:01
rsager
Messages: 17
Registered: June 2011
Location: Sydney, Australia
Junior Member
G'day Littlefoot...

Thanks for the reply... and I think you are correct. Like many Oracle things, there are pluses and minuses, and this may be one of the negatives of using external tables.
Even if I could get it to work, I thought that writing another SQL*Loader bat file that calls a control file, then executing a bit of SQL after that, would be a quicker way to a solution. This is what I have done...

First, the control file fed into SQL*Loader
=============================================
load data
infile 'J:\Grower\Round_Modules\Crop2011\Logs\Batch_2011Jul12_113124_7620.log'

BADFILE 'J:\Grower\Round_Modules\Crop2011\Logs\RoundModuleLoader_7620_TEST.bad'
replace into table PTLIVE.MODULE_CSV_SQLLOADER_SUMMARY
WHEN (1) = 'Total logical records skipped'
FIELDS TERMINATED BY ":"
(TOTAL_RECORDS_SKIPPED position (35:45 ),
IMPORT_CSV_FILE_NAME CONSTANT "Batch_2011Jul12_113124_7620.csv"
)
into table PTLIVE.MODULE_CSV_SQLLOADER_SUMMARY
WHEN (1) = 'Total logical records read'
FIELDS TERMINATED BY ":"
(
TOTAL_RECORDS_READ position (35:45 ),
IMPORT_CSV_FILE_NAME CONSTANT "Batch_2011Jul12_113124_7620.csv"
)
into table PTLIVE.MODULE_CSV_SQLLOADER_SUMMARY
WHEN (1) = 'Total logical records rejected'
FIELDS TERMINATED BY ":"
(
TOTAL_RECORDS_REJECTED position (35:45 ),
IMPORT_CSV_FILE_NAME CONSTANT "Batch_2011Jul12_113124_7620.csv"
)
into table PTLIVE.MODULE_CSV_SQLLOADER_SUMMARY
WHEN (1) = 'Total logical records discarded'
FIELDS TERMINATED BY ":"
(
TOTAL_RECORDS_DISCARDED position (35:45 ),
IMPORT_CSV_FILE_NAME CONSTANT "Batch_2011Jul12_113124_7620.csv"
)


Note the infile: it is the log file produced by the earlier csv load, not a csv file.
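
The control file was then run with another sqlldr call from the bat file, something along these lines (the control file name log_summary.ctl here is only illustrative):

sqlldr userid=ptlive/******@APPSV2PTLIVE10 control=J:\Grower\Round_Modules\Crop2011\log_summary.ctl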
===============================
The results:
===============================
col TOTAL_RECORDS_SKIPPED heading "Skipped"
col TOTAL_RECORDS_READ heading "Read"
col TOTAL_RECORDS_REJECTED heading "Rejects"
col TOTAL_RECORDS_DISCARDED heading "Discards"
col TOTAL_RECORDS_LOADED heading "Loaded"

select * from MODULE_CSV_SQLLOADER_SUMMARY ;
IMPORT_CSV_FILE_NAME              Skipped   Read  Rejects  Discards  Loaded  DATEIMPOR
--------------------------------  -------  -----  -------  --------  ------  ---------
Batch_2011Jul12_113124_7620.csv         1                                     15/JUL/11
Batch_2011Jul12_113124_7620.csv              527                              15/JUL/11
Batch_2011Jul12_113124_7620.csv                         0                     15/JUL/11
Batch_2011Jul12_113124_7620.csv                                  13           15/JUL/11
==============================
Run the SQL to consolidate the totals onto 1 row:

UPDATE MODULE_CSV_SQLLOADER_SUMMARY t1
SET t1.TOTAL_RECORDS_READ =
(SELECT t2.TOTAL_RECORDS_READ FROM MODULE_CSV_SQLLOADER_SUMMARY t2
WHERE t2.import_csv_file_name = t1.import_csv_file_name
and t2.import_csv_file_name = 'Batch_2011Jul12_113124_7620.csv'
AND t2.TOTAL_RECORDS_READ is not NULL)
WHERE TOTAL_RECORDS_SKIPPED is not NULL;


UPDATE MODULE_CSV_SQLLOADER_SUMMARY t1
SET t1.TOTAL_RECORDS_REJECTED =
(SELECT t2.TOTAL_RECORDS_REJECTED FROM MODULE_CSV_SQLLOADER_SUMMARY t2
WHERE t2.import_csv_file_name = t1.import_csv_file_name
and t2.import_csv_file_name = 'Batch_2011Jul12_113124_7620.csv'
AND t2.TOTAL_RECORDS_REJECTED is not NULL)
WHERE TOTAL_RECORDS_SKIPPED is not NULL and
TOTAL_RECORDS_READ is not NULL;


UPDATE MODULE_CSV_SQLLOADER_SUMMARY t1
SET t1.TOTAL_RECORDS_DISCARDED =
(SELECT t2.TOTAL_RECORDS_DISCARDED FROM MODULE_CSV_SQLLOADER_SUMMARY t2
WHERE t2.import_csv_file_name = t1.import_csv_file_name
and t2.import_csv_file_name = 'Batch_2011Jul12_113124_7620.csv'
AND t2.TOTAL_RECORDS_DISCARDED is not NULL)
WHERE TOTAL_RECORDS_SKIPPED is not NULL and
TOTAL_RECORDS_READ is not NULL and
TOTAL_RECORDS_REJECTED is not NULL;

UPDATE MODULE_CSV_SQLLOADER_SUMMARY t1
SET t1.TOTAL_RECORDS_LOADED = TOTAL_RECORDS_READ - TOTAL_RECORDS_REJECTED - TOTAL_RECORDS_DISCARDED
WHERE TOTAL_RECORDS_SKIPPED is not NULL and
TOTAL_RECORDS_READ is not NULL and
TOTAL_RECORDS_REJECTED is not NULL and
TOTAL_RECORDS_DISCARDED is not NULL;


DELETE FROM MODULE_CSV_SQLLOADER_SUMMARY
WHERE TOTAL_RECORDS_SKIPPED IS NULL and
import_csv_file_name = 'Batch_2011Jul12_113124_7620.csv' ;

commit;
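
As an aside, the three UPDATEs above could probably be collapsed into one multi-column UPDATE per csv file — a rough sketch only, not tested here:

UPDATE MODULE_CSV_SQLLOADER_SUMMARY t1
SET    (TOTAL_RECORDS_READ, TOTAL_RECORDS_REJECTED, TOTAL_RECORDS_DISCARDED) =
       -- fold the values from the other three rows onto the 'skipped' row
       (SELECT MAX(t2.TOTAL_RECORDS_READ),
               MAX(t2.TOTAL_RECORDS_REJECTED),
               MAX(t2.TOTAL_RECORDS_DISCARDED)
        FROM   MODULE_CSV_SQLLOADER_SUMMARY t2
        WHERE  t2.import_csv_file_name = t1.import_csv_file_name)
WHERE  t1.TOTAL_RECORDS_SKIPPED IS NOT NULL
AND    t1.import_csv_file_name = 'Batch_2011Jul12_113124_7620.csv';

The TOTAL_RECORDS_LOADED calculation and the DELETE of the leftover rows would stay the same.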
======================
the results:
=====================
SQL> select * from MODULE_CSV_SQLLOADER_SUMMARY ;

IMPORT_CSV_FILE_NAME              Skipped   Read  Rejects  Discards  Loaded  DATEIMPOR
--------------------------------  -------  -----  -------  --------  ------  ---------
Batch_2011Jul12_113124_7620.csv         1    527        0        13     514  15/JUL/11
===========================

So those are the results, and a solution that turned out to be reasonably easy.
Cheers
Roger


[Updated on: Thu, 14 July 2011 17:05]

