I have a table in Oracle SQL Developer that contains a column of type FLOAT. The data reader should return Decimal for the Oracle FLOAT data type, as per the mapping table given here: https://learn.microsoft.com/en-us/dotnet/framework/data/adonet/oracle-data-type-mappings
But the problem is that the data reader returns Double as the data type for the FLOAT column NFLOAT, as the code below shows. I also declared two other columns, NREAL and NFLOAT1, as FLOAT, and surprisingly the data reader returns Decimal for both of those columns.
Code:
using System;
using Oracle.ManagedDataAccess.Client;

static void Test()
{
    using (OracleConnection connection = new OracleConnection("connection string"))
    {
        connection.Open();
        using (OracleCommand command = connection.CreateCommand())
        {
            command.CommandText = "select id, NFLOAT from Numeric_Table";
            using (OracleDataReader reader = command.ExecuteReader())
            {
                for (int i = 0; i < reader.FieldCount; i++)
                {
                    var columnName = reader.GetName(i);
                    var dotNetType = reader.GetFieldType(i);  // .NET type the provider maps to
                    var sqlType = reader.GetDataTypeName(i);  // Oracle data type name
                    Console.WriteLine($"{columnName}: {dotNetType} ({sqlType})");
                }
            }
        }
    }
}
I am using Oracle.ManagedDataAccess.Client. Is this a bug in the Oracle.ManagedDataAccess.Client library, or am I doing something wrong?
Update: Based on the comments, I want to mention something. Although I may have referenced documentation that does not apply to the Oracle library I am using, I am still getting the Decimal data type for my other two columns, NREAL and NFLOAT1, so why is this behaviour not consistent?
The FLOAT data type is not the whole picture here. SQL Developer is not showing you that NFLOAT has a precision that's less than 50, while NREAL and NFLOAT1 have precisions that are 50 or higher. Left unspecified when created, the precision defaults to 126, which is the highest possible precision for the FLOAT data type.
A simple test query will demonstrate the difference:
SELECT CAST(0 AS FLOAT(49)), CAST(0 AS FLOAT(50)) FROM DUAL
The first column will be returned as System.Double. The second will be returned as System.Decimal.
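To verify this from .NET, the metadata loop from the question can be pointed at that query. A minimal sketch, where "connection string" is a placeholder and the F49/F50 aliases are just for readability:

using System;
using Oracle.ManagedDataAccess.Client;

static void TestFloatPrecision()
{
    using (var connection = new OracleConnection("connection string"))
    using (var command = connection.CreateCommand())
    {
        connection.Open();
        command.CommandText =
            "SELECT CAST(0 AS FLOAT(49)) F49, CAST(0 AS FLOAT(50)) F50 FROM DUAL";
        using (OracleDataReader reader = command.ExecuteReader())
        {
            for (int i = 0; i < reader.FieldCount; i++)
            {
                // Expected: F49 -> System.Double, F50 -> System.Decimal
                Console.WriteLine($"{reader.GetName(i)}: {reader.GetFieldType(i)}");
            }
        }
    }
}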
In SQL Developer, right-click on the table and select "Edit..." That dialog does show the defined precision for the column.
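If you'd rather check it with a query, the defined precision is also visible in the data dictionary. A minimal sketch, assuming the question's table name (unquoted identifiers are stored in uppercase) and the same using directives and placeholder connection string as above:

using (var connection = new OracleConnection("connection string"))
using (var command = connection.CreateCommand())
{
    connection.Open();
    command.CommandText =
        "SELECT COLUMN_NAME, DATA_PRECISION FROM USER_TAB_COLUMNS " +
        "WHERE TABLE_NAME = 'NUMERIC_TABLE'";
    using (OracleDataReader reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            // NFLOAT should report a precision below 50; NREAL and NFLOAT1
            // should report 50 or higher (126 if it was never specified).
            Console.WriteLine($"{reader.GetString(0)}: {reader.GetValue(1)}");
        }
    }
}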
If you want NFLOAT to be returned as System.Decimal, increase the precision to at least 50.
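A sketch of that change, assuming the table and column names from the question; this is hypothetical DDL, so check it against your actual schema first:

using (var connection = new OracleConnection("connection string"))
using (var command = connection.CreateCommand())
{
    connection.Open();
    // Raising the binary precision to 50 or more should change the
    // type the managed driver reports for the column to Decimal.
    command.CommandText = "ALTER TABLE Numeric_Table MODIFY (NFLOAT FLOAT(50))";
    command.ExecuteNonQuery();
}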
I've been combing the documentation and running my own tests lately. It's a pet project and I don't use Oracle regularly so there's been a lot of trial and error.
The DUMP function has been quite helpful in determining if I have the right data type in an expression before seeing what ODP.NET does to it.
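For example (a small sketch, reusing the same setup as above; the exact Len and byte values depend on the value dumped):

using (var connection = new OracleConnection("connection string"))
using (var command = connection.CreateCommand())
{
    connection.Open();
    command.CommandText = "SELECT DUMP(CAST(0 AS FLOAT(49))) FROM DUAL";
    // Prints something like "Typ=2 Len=1: 128" -- Typ=2 means Oracle NUMBER,
    // which is how FLOAT values are stored internally.
    Console.WriteLine(command.ExecuteScalar());
}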
For another example of how ODP.NET returns different data types depending on the precision, consider NUMBER(p,0).
Type      Min p   Max p
-----------------------
Int16       1       4
Int32       5       9
Int64      10      18
Decimal    19      38
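The same CAST approach from the FLOAT test demonstrates those boundaries. A minimal sketch with hypothetical column aliases; expected output per the table above:

using (var connection = new OracleConnection("connection string"))
using (var command = connection.CreateCommand())
{
    connection.Open();
    command.CommandText =
        "SELECT CAST(0 AS NUMBER(4,0)) P4, CAST(0 AS NUMBER(5,0)) P5, " +
        "CAST(0 AS NUMBER(18,0)) P18, CAST(0 AS NUMBER(19,0)) P19 FROM DUAL";
    using (OracleDataReader reader = command.ExecuteReader())
    {
        for (int i = 0; i < reader.FieldCount; i++)
        {
            // Expected: P4 -> Int16, P5 -> Int32, P18 -> Int64, P19 -> Decimal
            Console.WriteLine($"{reader.GetName(i)}: {reader.GetFieldType(i)}");
        }
    }
}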
I was hoping that Byte or SByte would be returned when p is 1 or 2, but that didn't pan out.