I have a query:

```sql
select real_arrival_date - planned_delivery_date as days_late
from ifsapp.purchase_receipt_statistics
```

where real_arrival_date and planned_delivery_date are both columns of type DATE. The resulting days_late is a decimal number indicating how many days late our purchases arrived.
I execute this in C# using the Oracle Managed Client (though I suspect the regular unmanaged ODP.NET client has the same problem):

```csharp
var command = new OracleCommand(
    "select real_arrival_date - planned_delivery_date as days_late " +
    "from ifsapp.purchase_receipt_statistics");
var result = GetDataTable(command);

public static DataTable GetDataTable(OracleCommand command)
{
    DataTable dt = new DataTable();
    using (var connection = GetDefaultOracleConnection())
    {
        command.Connection = connection;
        connection.Open();
        dt.Load(command.ExecuteReader()); // InvalidCastException thrown here
    }
    return dt;
}
```
I get an InvalidCastException at the indicated line. Why is this happening, and how do I fix it?
This is happening because subtracting two DATE columns in Oracle yields a NUMBER measured in days, and the fractional part (the leftover seconds divided by 86400) usually does not terminate, for example: 54.4514351851852... Oracle's NUMBER type stores up to 38 significant digits, more than the 28-29 digits a .NET decimal can hold, so the client throws an InvalidCastException on what looks like a rather straightforward query.
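You can see how such a repeating fraction arises directly in an Oracle session (the 39004 below is a hypothetical count of leftover seconds, chosen to reproduce the example value):

```sql
-- 39004 seconds past midnight, expressed as a fraction of a day,
-- produces a decimal expansion that never terminates:
select 39004/86400 as fractional_day from dual;
-- 0.451435185185... (the group "185" repeats forever)
```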
The fix is to round the number off using Oracle's ROUND() function so that it fits comfortably in a .NET decimal. Change the query like this to round it to two decimal places:

```sql
select round(real_arrival_date - planned_delivery_date, 2) as days_late
from ifsapp.purchase_receipt_statistics
```

The Oracle client will then be able to successfully convert the value to a decimal, and the query will execute without an issue.
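If losing precision to rounding is a concern, one variation (a sketch using the same table and columns from the question) is to return whole seconds instead of fractional days; the result is an integer, so it converts cleanly:

```sql
select round((real_arrival_date - planned_delivery_date) * 86400) as seconds_late
from ifsapp.purchase_receipt_statistics
```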