I have a model with an `amount` column, which is a decimal in the database. I'd like to ensure that only a `BigDecimal` with a certain precision is ever given when this model is instantiated. I've written specs to test the scenario where a `Float` is provided, and I have a `before_create` callback that raises an error if the value is not a `BigDecimal` (roughly as sketched below).
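
For context, the callback looks roughly like this (the `Payment` model name is just for illustration):

```ruby
require "bigdecimal"

class Payment < ActiveRecord::Base
  before_create :ensure_amount_is_a_big_decimal

  private

  # By the time this runs, Rails has already typecast the
  # assigned value, so a Float never actually reaches this check.
  def ensure_amount_is_a_big_decimal
    raise ArgumentError, "amount must be a BigDecimal" unless amount.is_a?(BigDecimal)
  end
end
```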
However, by the time the value reaches the `before_create` callback, Rails has already converted it to a `BigDecimal`. This is nice, I suppose, and I can probably still check for precision, but since I don't know exactly how Rails goes about the conversion, it would be nice to check for the proper argument type and precision further up the chain.
Is there any way to do this?
From http://api.rubyonrails.org/classes/ActiveRecord/Base.html:

> Sometimes you want to be able to read the raw attribute data without having the column-determined typecast run its course first. That can be done by using the `_before_type_cast` accessors that all attributes have. For example, if your Account model has a `balance` attribute, you can call `account.balance_before_type_cast` or `account.id_before_type_cast`.
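
So you can validate against the raw value instead of the typecast one. A minimal sketch, reusing the illustrative `Payment` model with the `amount` column from the question:

```ruby
require "bigdecimal"

class Payment < ActiveRecord::Base
  validate :amount_must_be_given_as_a_big_decimal

  private

  # amount_before_type_cast returns the value exactly as it was
  # assigned, before ActiveRecord's column-based typecast runs.
  def amount_must_be_given_as_a_big_decimal
    raw = amount_before_type_cast
    unless raw.is_a?(BigDecimal)
      errors.add(:amount, "must be given as a BigDecimal, got #{raw.class}")
    end
  end
end
```

Once the class check passes, you can layer a precision check on top of that same raw value.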