Tags: oracle, export-to-csv, sqlplus, utl-file

Write to multiple CSV files within Oracle SQL script


I have a requirement to generate a series of reports (in CSV format) based on data held in an Oracle database (v19.20).

The reports are to be generated by running a shell script on a local server which can access the database; the script executes a SQL*Plus command to extract the data.

e.g.:

generateReports.sh

#!/bin/bash

function logger() {
        local ts=$(date +%D\ %R:%S)
        echo "${ts}:${0##*/} - ${1}"
        echo "${ts}:${0##*/} - ${1}" >> logs/generateReports.log
}

function prop {
    grep "${1}" database.properties|cut -d'=' -f2
}

function isOracleError() {
    local countOfORAErrors=$(grep "ORA-" logs/*.log | wc -l)
    local countOfSP2Errors=$(grep "SP2-" logs/*.log | wc -l)
    local countOfCompileErrors=$(grep "compilation errors" logs/*.log | wc -l)
    if [[ ${countOfORAErrors} -gt 0 || ${countOfSP2Errors} -gt 0 || ${countOfCompileErrors} -gt 0 ]]
    then
        logger "Last script failed with ${countOfORAErrors} ORA- and ${countOfSP2Errors} SP2- errors, halting execution"
        exit 1
    else
        logger "Script completed successfully"
    fi
}

USERID="$(prop 'USERID')"
PASSW="$(prop 'PASSW')"
HOST="$(prop 'HOST')"
PORT="$(prop 'PORT')"
SERVICE="$(prop 'SERVICE')"
TNS_STRING="'(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=${HOST})(PORT=${PORT}))(CONNECT_DATA=(SERVER=DEDICATED)(SERVICE_NAME=${SERVICE})))'"


sqlplus "${USERID}/${PASSW}@${TNS_STRING}" @generateReports.sql < /dev/null 2>&1 | tee -a logs/dbScripts.log

logger "######################################################################################################"
logger "Process completed : Successfully. "
logger "######################################################################################################"
exit

The number of reports to be generated depends on data held in table "REPORT_CATEGORY", which will be set up as follows:

CATEGORY_CODE   DESCRIPTION
A               Report Category A
B               Report Category B
C               Report Category C
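
For reference, a minimal sketch of how this lookup table could be created and seeded (the column types and lengths are assumptions for illustration, not taken from the actual schema):

-- Assumed shape of the lookup table; column types/lengths are illustrative only.
CREATE TABLE report_category (
    category_code VARCHAR2(10)  PRIMARY KEY,
    description   VARCHAR2(100)
);

INSERT INTO report_category VALUES ('A', 'Report Category A');
INSERT INTO report_category VALUES ('B', 'Report Category B');
INSERT INTO report_category VALUES ('C', 'Report Category C');
COMMIT;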

For each category code found in the above table, I will need to execute a separate query (filtering on the category value in the tables which hold the data I need to report on) and write the results to a CSV file on the local server where the shell script is being run, e.g.:

  • /opt/reports/category_a.csv
  • /opt/reports/category_b.csv
  • /opt/reports/category_c.csv

Currently this is done using a cursor to loop through the codes, execute a SQL statement for each one, and use the "utl_file" package to write the CSV output:

generateReports.sql:

SET SERVEROUTPUT ON
SET ECHO ON
WHENEVER SQLERROR EXIT FAILURE ROLLBACK

CREATE OR REPLACE DIRECTORY report_dir AS '/opt/reports';

--##########################################################################################################################

DECLARE
   v_file_handle   utl_file.file_type;
   c_category_code report_category.category_code%type;
   CURSOR c_report_category IS
      SELECT category_code
      FROM   report_category;
BEGIN
   OPEN c_report_category;
   LOOP
      FETCH c_report_category INTO c_category_code;
      EXIT WHEN c_report_category%NOTFOUND;
      -- Directory object names are stored in upper case, so pass 'REPORT_DIR' (not 'report_dir') to fopen
      v_file_handle := utl_file.fopen('REPORT_DIR', c_category_code || '.csv', 'a');
      -- Execute generic query with filter on c_category_code and write results to v_file_handle using utl_file.put_line
      utl_file.fclose(v_file_handle);
   END LOOP;
   CLOSE c_report_category;
END;
/

--##########################################################################################################################
EXIT

Whenever I try to execute this, I am hit with error:

ORA-04088: error during execution of trigger 'RDSADMIN.RDS_DDL_TRIGGER2'
ORA-00604: error occurred at recursive SQL level 1
ORA-20900: Invalid path used for directory: /opt/reports
ORA-06512: at "RDSADMIN.RDSADMIN_TRIGGER_UTIL", line 631
ORA-06512: at line 1
ORA-06512: at line 12

It seems that utl_file will only permit me to write to directories on the server where the database is being hosted, but I need to write this to the local server. The alternative appears to be using spool, but I am having difficulty switching off unwanted messages/logging and splitting to multiple files with this approach.

What is the best way to write to multiple CSV files from within the same SQL*Plus execution?


Solution

  • It seems that utl_file will only permit me to write to directories on the server where the database is being hosted.

    A DIRECTORY object in the database refers to a directory on the database server. The database server has no concept of the directories that can be accessed by any client (and it would likely be a huge security issue if it did).

    UTL_FILE can only write to a file within a DIRECTORY object so, yes, it can only write to files on the database server.

  • The alternative appears to be using spool, but I am having difficulty switching off unwanted messages/logging and splitting to multiple files with this approach.

    Yes, use SPOOL filename to start writing output to a file and SPOOL OFF to stop; you can then issue another SPOOL command to start writing to the next file.

    You can refer to the SQL*Plus documentation and use the SET and COLUMN commands (among others) to control how the output is formatted and what is, or is not, displayed; a sketch combining these is shown below.
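
As a concrete illustration, here is a minimal sketch of what a spool-based generateReports.sql could look like. It is a sketch only: the table REPORT_DATA and its columns (id, name, amount) are placeholders invented for the example, and SET MARKUP CSV ON requires a SQL*Plus client of version 12.2 or later (with an older client you would instead concatenate the columns yourself, e.g. SELECT id || ',' || name || ',' || amount ...).

-- Sketch only: REPORT_DATA and its columns are hypothetical placeholders.
SET ECHO OFF          -- do not echo the script's commands
SET FEEDBACK OFF      -- suppress "n rows selected." messages
SET VERIFY OFF        -- suppress substitution-variable messages
SET TERMOUT OFF       -- suppress terminal output while the script runs (spool files are still written)
SET TRIMSPOOL ON      -- strip trailing blanks from spooled lines
SET MARKUP CSV ON     -- SQL*Plus 12.2+; format query output as CSV

SPOOL /opt/reports/category_a.csv
SELECT id, name, amount FROM report_data WHERE category_code = 'A';
SPOOL OFF

SPOOL /opt/reports/category_b.csv
SELECT id, name, amount FROM report_data WHERE category_code = 'B';
SPOOL OFF

SPOOL /opt/reports/category_c.csv
SELECT id, name, amount FROM report_data WHERE category_code = 'C';
SPOOL OFF

EXIT

Because SPOOL is executed by the SQL*Plus client, the files are written on the local server where generateReports.sh runs, which is exactly what UTL_FILE could not do. Note that SET TERMOUT OFF also means little output will reach the tee in generateReports.sh, so leave it ON if you depend on that piped log.

Since the category list is driven by REPORT_CATEGORY rather than hard-coded, one common SQL*Plus pattern (offered here as an assumption-laden sketch, not something stated in the original answer; run_reports.sql is an invented name) is to first spool a small driver script whose SPOOL/SELECT/SPOOL OFF commands are generated from that table, and then run it:

-- Sketch only: an alternative generateReports.sql that builds run_reports.sql
-- from REPORT_CATEGORY and then executes it. REPORT_DATA is still hypothetical.
SET ECHO OFF
SET FEEDBACK OFF
SET VERIFY OFF
SET TERMOUT OFF
SET TRIMSPOOL ON
SET HEADING OFF       -- keep the generated driver script free of column headers
SET PAGESIZE 0
SET LINESIZE 500

SPOOL run_reports.sql
SELECT 'SPOOL /opt/reports/category_' || LOWER(category_code) || '.csv' || CHR(10) ||
       'SELECT id, name, amount FROM report_data WHERE category_code = ''' ||
       category_code || ''';' || CHR(10) ||
       'SPOOL OFF'
FROM   report_category
ORDER BY category_code;
SPOOL OFF

SET MARKUP CSV ON     -- switch to CSV formatting before running the generated spools
SET HEADING ON
SET PAGESIZE 50000
@run_reports.sql

EXIT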