
How to export a VARCHAR2 containing special characters from an Oracle DB and process it from bash


I want to export data from an Oracle DB and then process it with a bash script. However, the output may contain special characters (newlines, etc.), which makes the plain file hard to process line by line. So I have tried to base64-encode the relevant strings with

UTL_ENCODE.base64_encode(utl_raw.cast_to_raw(...)) 

The output file will then contain lines like:

some_text|base64_encoded_string

which gives me easy-to-process text files. However, the decoding step doesn't work: the output is gibberish.

while IFS= read -r line
do
    name=${line%|*}      # part before the '|'
    str64=${line#*|}     # part after the '|'
    str_dec=$(echo "${str64}" | base64 --decode)
    echo "${name} : ${str_dec}"
done < "${export_file}"

That's probably because UTL_ENCODE.base64_encode returns a RAW, and the export renders a RAW as its hex representation rather than as the base64 text itself.
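As a quick sanity check of that theory (the hex value below is illustrative, reconstructed from the string 'hello'; `xxd` ships with vim and is normally available under Cygwin): the spooled line holds the hex dump of the base64 text, so an extra hex-decode step recovers it:

```shell
#!/usr/bin/env bash
# 'hello' -> cast_to_raw -> base64_encode yields the base64 text 'aGVsbG8=',
# but a RAW is spooled as its hex dump: 614756736247383D
hex_line='614756736247383D'

# undo the hex dump first
b64=$(printf '%s' "$hex_line" | xxd -r -p)   # -> aGVsbG8=
printf '%s\n' "$b64"

# now the normal base64 decode works
printf '%s' "$b64" | base64 --decode         # -> hello
```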

So how can I correctly decode the string ?

Alternatively, how else can I encode (protect) the string so that it can easily be decoded from bash? That would help as well.

btw.: bash is running in cygwin.


Solution

  • Actually it was pretty simple... I just had to cast the result back to a VARCHAR2:

    replace(
     replace(
      utl_raw.cast_to_varchar2(      -- back to VARCHAR2, so the export gets the base64 text, not a hex dump
       UTL_ENCODE.base64_encode(utl_raw.cast_to_raw(...))
      ), chr(10), ''                 -- strip LF
     ), chr(13), ''                  -- strip CR
    )
    

    and it decoded fine. I had to remove the CR/LF line breaks from the result as well, since base64_encode wraps its output into MIME-style lines.
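With the VARCHAR2 cast in place, the export holds plain base64 text and the loop from the question works as written. A minimal end-to-end sketch (the file name and sample rows are made up for illustration):

```shell
#!/usr/bin/env bash
# Sample export in the question's 'name|base64' layout (made-up rows):
# 'aGVsbG8=' decodes to 'hello', 'bGluZTEKbGluZTI=' to 'line1<LF>line2'.
export_file=export.txt
printf 'greeting|aGVsbG8=\nmultiline|bGluZTEKbGluZTI=\n' > "$export_file"

# let read split on '|' directly instead of trimming with ${...%...}
while IFS='|' read -r name str64; do
    str_dec=$(printf '%s' "$str64" | base64 --decode)
    echo "${name} : ${str_dec}"
done < "$export_file"
```

Splitting with `IFS='|' read -r name str64` avoids the two parameter-expansion trims, and `printf '%s'` keeps the payload from picking up a stray trailing newline before decoding.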