Problem: I have a large CSV that I want to insert into a DB2 table with Rails
Description: The CSV is about 2k lines/8K characters. The CLOB column is set up to handle over 10K characters. I can insert the CSV just fine through the RubyMine database console. However, my app crashes.
ActiveRecord produces one huge INSERT statement. Code:
Logger.create(csv: csv_data.to_s)
DB2 returns an error:
ActiveRecord::JDBCError: [SQL0102] String constant beginning with 'foobar' too long.
I can insert huge PDF files into BLOB columns just fine using similar code. I tried creating the record first and then updating it with the data; no difference.
This problem is the same as this, except I need a Rails solution rather than a general one.
Found a workaround by splitting csv_data into chunks and appending each one to the column:

update_attribute(:csv, '') if self.csv.nil? # Can't CONCAT to NULL

# Split csv_data into chunks of at most 6144 characters,
# then concatenate each one onto the CLOB column
csv_data.scan(/.{1,6144}/m).each do |part|
  parm = ActiveRecord::Base.connection.quote(part)
  ActiveRecord::Base.connection.execute(
    "UPDATE #{Logger.table_name} SET csv = CONCAT(csv, #{parm}) WHERE id = #{self.id}"
  )
end
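
The chunking step can be sanity-checked in plain Ruby without a database connection. This is a minimal sketch, assuming the 6144-character chunk size from the workaround above (the `chunk_csv` helper name is hypothetical, not part of the original code):

```ruby
# Hypothetical helper mirroring the workaround's split logic.
# The /m flag lets '.' match newlines, so CSV line breaks survive chunking.
CHUNK_SIZE = 6144

def chunk_csv(csv_data, size = CHUNK_SIZE)
  csv_data.scan(/.{1,#{size}}/m)
end

# Sample data roughly the size described in the question (~18K characters)
csv_data = "a,b,c\n" * 3_000
chunks   = chunk_csv(csv_data)

chunks.each { |part| raise "chunk too long" if part.length > CHUNK_SIZE }
raise "data corrupted by chunking" unless chunks.join == csv_data
```

Because each chunk is quoted and sent as a separate UPDATE, no single string constant in the generated SQL exceeds the limit DB2 complained about.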