I've got a strange problem which I just can't figure out. I have currencies stored in my database as a decimal. I am reading from a CSV file and converting each string to a decimal to store in the database. Most values are stored correctly, but some are not: a value of 1000 is stored as 1, and 2299 is stored as 2, so there is obviously a problem with numbers over 999.99.
I ran a database migration as follows:
def self.up
  change_column(:transactions, :in,  :decimal, :precision => 8, :scale => 2)
  change_column(:transactions, :out, :decimal, :precision => 8, :scale => 2)
end
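To confirm the change took effect, you can inspect the column metadata from the console (columns_hash is standard ActiveRecord; the output below just reflects the migration above having run):
>> Transaction.columns_hash['in'].precision
=> 8
>> Transaction.columns_hash['in'].scale
=> 2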
Here is the code used to store the value from the CSV file:
def create
  data = params[:dump][:file].read
  FasterCSV.parse(data, :headers => true) do |row|
    transaction = Transaction.new
    transaction.date = Date.strptime(row[0], "%d/%m/%Y")
    transaction.transaction_type = row[4]
    transaction.details = row[3]
    if row[7].to_f < 0
      transaction.out = row[7].to_d.abs
    else
      transaction.in = row[7].to_d.abs
    end
    transaction.save
  end
end
(The .abs is because money-out values are stored as negative numbers in the CSV file.)
However, when I use the console to create a new transaction, converting a string of 1000 and storing it with the same method, it works fine and the value is stored as 1000.0.
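For reference, the bare conversion looks right in the console (String#to_d comes from bigdecimal/util, which Rails pulls in):
>> '1000'.to_d.to_s
=> "0.1E4"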
Does anybody have any idea why this would be? I wouldn't have thought this is a FasterCSV issue, but I suppose it's possible that the CSV numbers are not being read properly.
Thanks for any help,
Tom
Have you looked at the raw CSV data for a row with a value >= 1000? It sounds to me like the data is formatted with commas every 3 digits, and in that case the to_d method will ignore everything after the first comma.
>> '1,123.41'.to_d
=> #<BigDecimal:10593e0a8,'0.1E1',9(18)>
If that's the problem, just strip the commas using gsub:
>> '1,123.41'.gsub(',','').to_d
=> #<BigDecimal:105932398,'0.112341E4',18(18)>