Tags: c#, sql-server, sqlclr

Set decimal precision in SQL CLR function in C#


I have a SQL CLR function written in C# that goes something like:

public partial class UserDefinedFunctions
{
    [Microsoft.SqlServer.Server.SqlFunction]
    public static SqlString Decimal(decimal sum)
    {
        return sum.ToString();
    }
}

When the function is deployed, SQL Server maps the `sum` parameter to `numeric(18, 0)` by default.

Is there a way to change the precision and scale of an input parameter in C#?


Solution

  • The precision and scale can be specified by applying the SqlFacet attribute to the parameter, e.g. [Microsoft.SqlServer.Server.SqlFacet(Precision = ..., Scale = ...)].
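Applied to the question's function, this might look as follows. The values Precision = 18 and Scale = 4 are illustrative choices, not something mandated by the question:

```csharp
using Microsoft.SqlServer.Server;
using System.Data.SqlTypes;

public partial class UserDefinedFunctions
{
    [SqlFunction]
    public static SqlString Decimal(
        // SqlFacet on the parameter tells the deployment tooling which
        // SQL type to emit; here sum becomes numeric(18, 4).
        [SqlFacet(Precision = 18, Scale = 4)] decimal sum)
    {
        return sum.ToString();
    }
}
```

Note that SqlFacet is honored by the deployment tooling (Visual Studio / SSDT) when it generates the CREATE FUNCTION statement. If you register the assembly with a hand-written CREATE FUNCTION, you specify the parameter type there directly, e.g. `@sum numeric(18, 4)`.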