Tags: c#, .net, linq

Handling null results with the LINQ Average() method


I am new to LINQ and am trying to create some data points from a table to graph. The three fields of importance in this table are the id, the time and the value. I am writing a query to get the average value over a set time interval for a chosen id. The LINQ query I have written is:

var value = (from t in _table
             where t.Id == id
                 && t.Time >= intervalStartTime
                 && t.Time <= intervalEndTime
             select t.Value).Average();

However, this crashes at runtime with:

"The null value cannot be assigned to a member with type System.Decimal which is a non-nullable value type.."

At certain intervals there is no data, so the SQL that LINQ generates returns null, which I would like to be coalesced to 0; instead it crashes the application. Is there a way to write this LINQ query so that it handles this properly?

The table definition to make things clearer:

[Serializable]
[Table(Name = "ExampleTable")]
public class ExampleTable
{
    [Column(Name = "Id")]
    public int Id { get; set; }

    [Column(Name = "Time")]
    public DateTime Time { get; set; }

    [Column(Name = "Value")]
    public int Value { get; set; }
}

Solution

  • I think you want

    var value = (from t in _table
                 where t.Id == id
                    && t.Time >= intervalStartTime
                    && t.Time <= intervalEndTime
                 select (int?)t.Value).Average();
    

    This way you get a double? back, whereas without the (int?) cast you get a double back, which cannot be null.

    This is because of the two overload signatures:

    double Enumerable.Average(IEnumerable<int> source)
    double? Enumerable.Average(IEnumerable<int?> source)
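
    To see the difference concretely, here is a minimal LINQ to Objects sketch (an illustration with an in-memory list standing in for the table; note that LINQ to Objects' non-nullable overload throws InvalidOperationException on an empty sequence, while LINQ to SQL surfaces it as the null-assignment error quoted in the question):

    using System;
    using System.Collections.Generic;
    using System.Linq;

    class AverageOverloadDemo
    {
        static void Main()
        {
            var noRows = new List<int>(); // stands in for an interval with no data

            // double Average(IEnumerable<int>): there is no null to return,
            // so an empty source throws.
            try
            {
                double avg = noRows.Average();
            }
            catch (InvalidOperationException ex)
            {
                Console.WriteLine(ex.Message); // "Sequence contains no elements"
            }

            // double? Average(IEnumerable<int?>): an empty (or all-null)
            // source simply yields null, which can then be coalesced.
            double? nullableAvg = noRows.Select(v => (int?)v).Average();
            Console.WriteLine(nullableAvg ?? 0.0); // prints 0
        }
    }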
    

    Now, to get an average of 0 instead of null, apply the null-coalescing operator to the result:

    var value = (from t in _table
                 where t.Id == id
                    && t.Time >= intervalStartTime
                    && t.Time <= intervalEndTime
                 select (int?)t.Value).Average() ?? 0.0;
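
    For reference, the same fix in method syntax (a sketch reusing the identifiers from the question):

    double value = _table
        .Where(t => t.Id == id
                 && t.Time >= intervalStartTime
                 && t.Time <= intervalEndTime)
        .Select(t => (int?)t.Value) // the cast selects the nullable Average overload
        .Average() ?? 0.0;          // coalesce the "no rows" null to 0

    With ?? 0.0 applied, value is an ordinary non-nullable double, so it can be used as a data point directly.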
    

    IMHO this is a pretty awful design of the Enumerable/Queryable classes; why can't Average(IEnumerable&lt;int&gt;) return double? as well, instead of reserving that for Average(IEnumerable&lt;int?&gt;)?