I have a SELECT query that is run via the cx_Oracle library. The output of this query is stored as a list of dicts and needs to be saved to a JSON file for future use and iterations.
But the query result contains cx_Oracle.LOB objects, and because of that I am stuck with the error "TypeError: Object of type LOB is not JSON serializable" and am unable to write the JSON file. Please find my code:
import cx_Oracle
import json

con = cx_Oracle.connect(***)
cursor = con.cursor()
cursor.execute(q)
col_names = [col[0] for col in cursor.description]
rv = cursor.fetchall()
json_data = []
for result in rv:
    json_data.append(dict(zip(col_names, result)))
with open("result.json", 'w') as fp:
    fp.write(json.dumps(json_data))
Sample output of the SELECT query:
[
    {
        "name": "abc",
        "age": 10,
        "skills": <cx_Oracle.LOB object at 0x00000123>
    },
    {
        "name": "def",
        "age": 10,
        "skills": <cx_Oracle.LOB object at 0x000004456>
    }
]
Following the cx_Oracle documentation section Fetching LOBs as Strings and Bytes, create an output type handler:

def output_type_handler(cursor, name, default_type, size, precision, scale):
    if default_type == oracledb.DB_TYPE_CLOB:
        return cursor.var(oracledb.DB_TYPE_LONG, arraysize=cursor.arraysize)
    if default_type == oracledb.DB_TYPE_BLOB:
        return cursor.var(oracledb.DB_TYPE_LONG_RAW, arraysize=cursor.arraysize)
    if default_type == oracledb.DB_TYPE_NCLOB:
        return cursor.var(oracledb.DB_TYPE_LONG_NVARCHAR, arraysize=cursor.arraysize)

connection.outputtypehandler = output_type_handler

Set the handler on the connection before calling cursor.execute(), so that CLOBs come back as str and BLOBs as bytes.
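One caveat: even after the handler converts BLOBs, you get bytes values, which json.dumps also rejects. A minimal sketch of one way to handle that (the function name and sample row are illustrative, not from your code) is to base64-encode any bytes before serializing:

```python
import base64
import json

def jsonable(value):
    # bytes (e.g. a BLOB fetched as DB_TYPE_LONG_RAW) are not JSON
    # serializable; encode them as base64 text
    if isinstance(value, bytes):
        return base64.b64encode(value).decode("ascii")
    return value

# stand-in for a fetched row after the output type handler has run
rows = [{"name": "abc", "age": 10, "skills": b"\x00\x01"}]
json_data = [{k: jsonable(v) for k, v in row.items()} for row in rows]
print(json.dumps(json_data))
```

For CLOB/NCLOB columns this step is unnecessary, since the handler already returns plain str.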
This assumes your LOBs will fit in memory (and are each 1 GB or smaller), but you seem to be making that assumption already.
Update: in the latest version of cx_Oracle (now called python-oracledb), you can simply set oracledb.defaults.fetch_lobs = False and don't need the type handler.
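If upgrading isn't an option, another route is to read each LOB explicitly at serialization time via json.dumps's default hook. LOB objects expose a read() method per the cx_Oracle LOB API; the FakeLOB class below is a hypothetical stand-in used only so this sketch runs without a database:

```python
import json

class FakeLOB:
    """Stand-in for cx_Oracle.LOB, which exposes read().
    Used here only so the sketch is runnable without Oracle."""
    def __init__(self, data):
        self._data = data

    def read(self):
        return self._data

def lob_default(obj):
    # json.dumps calls this for any object it cannot serialize itself;
    # reading a LOB returns str (CLOB) or bytes (BLOB)
    if hasattr(obj, "read"):
        return obj.read()
    raise TypeError(f"Object of type {type(obj).__name__} is not JSON serializable")

row = {"name": "abc", "age": 10, "skills": FakeLOB("python,sql")}
print(json.dumps(row, default=lob_default))
```

Note that for BLOBs, read() returns bytes, which would still need decoding or base64-encoding before json.dumps accepts them.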