Tags: c#, .net, integer, testcase

How would I create my own test case to figure out why this code returns the correct answer half of the time?


I'm trying to figure out why, if I change my values from 4 and 2 to something like 4 and 3, it doesn't compute the average correctly.

I would like to know two things: how to run a test case for something as simple as this, and how to fix my code so that it averages two numbers correctly every time.

using System;

public class MathUtils
{
    public static double Average(int a, int b)
    {
        return (a + b) / 2;
    }

    public static void Main(string[] args)
    {
        Console.WriteLine(Average(4, 2));
    }
}

// right now returns 3, which is correct

Solution

  • Change it to:

    public static double Average(int a, int b)
    {
        return (a + b) / 2.0; // will be incorrect for edge case with int-overflow - see Edit 
    }
    

    Reason: If you add up two integers, you get an integer. If you divide an integer by an integer, you get another integer by default - not a float or double. The part after the decimal point is discarded.
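
    For instance, a minimal sketch of the difference (the literals are just illustrative):

        Console.WriteLine(7 / 2);       // prints 3 - integer division truncates
        Console.WriteLine(7 / 2.0);     // prints 3.5 - one double operand forces floating-point division
        Console.WriteLine((4 + 3) / 2); // prints 3 - the bug from the question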


    Edit: As Hans Passant pointed out, you can get an int overflow in case both ints add up to more than an int can hold (in unchecked code the sum silently wraps around) - so casting (at least one of) them to double is the smarter move

        return ((double)a + b) / 2; // .0 no longer needed.
    

    You need to get some non-integer arithmetic into the mix to keep the .xxxx part as well.
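
    A quick sketch of the overflow case (the values are only illustrative):

        int a = int.MaxValue;
        int b = int.MaxValue;
        Console.WriteLine((a + b) / 2.0);       // a + b wraps to -2 in unchecked code, so this prints -1
        Console.WriteLine(((double)a + b) / 2); // the addition happens as double, prints 2147483647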

    As for the testcase - that depends on the testing framework you are using. You should probably consider test cases of (int.MinValue, int.MinValue), (int.MaxValue, int.MaxValue) and some easy ones: (0, 0), (1, 1), (1, 2).
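
    As a sketch, assuming xUnit (the test class and method names here are made up; NUnit or MSTest work the same way, only the attributes differ):

        using Xunit;

        public class MathUtilsTests
        {
            [Theory]
            [InlineData(0, 0, 0.0)]
            [InlineData(1, 1, 1.0)]
            [InlineData(1, 2, 1.5)]   // fails with the original code: integer division yields 1
            [InlineData(int.MaxValue, int.MaxValue, (double)int.MaxValue)] // fails without the cast: the int sum wraps
            [InlineData(int.MinValue, int.MinValue, (double)int.MinValue)]
            public void Average_ReturnsExactMean(int a, int b, double expected)
            {
                Assert.Equal(expected, MathUtils.Average(a, b));
            }
        }

    The (1, 2) row alone catches the truncation; the MaxValue row catches the overflow.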

    For how to detect the error: get some C# experience - or use intermediate variables, breakpoints, and a debugger to see what goes wrong where.

    This here:

    public static double Average(int a, int b)
    {
        var summed = a + b;   // inferred as int
        var avg = summed / 2; // also int - the fractional part is already gone here
    
        return avg;           // only widened to double on return
    }
    

    stepped through in a debugger, would point out the error quite fast: hovering over summed or avg shows that both are ints.