I am implementing a simple two-state particle filter. If you don't know what a particle filter is, that's fine - the short version is that I need to compute weighted means, where both the values and the weights are between 0 and 1. Each particle has a value and a weight.
C# is giving me absolutely bizarre numerical problems though.
While trying to debug this, here is what my code looks like:
ConcurrentBag<Particle> particles; // this is passed in as an argument to my function
double mean = 0.0;
double totalWeight = 0.0;
foreach (Particle p in particles)
{
    mean += p.Value * p.Weight;
    totalWeight += p.Weight;
    if (p.Value > 1.01 || p.Weight > 1.01)
    {
        Console.WriteLine("Value " + p.Value);
        Console.WriteLine("Weight " + p.Weight);
        Console.WriteLine("wtf");
    }
}
if (totalWeight == 0.0)
{
    // in this case, everything has minuscule weight, so let's just return 0.0 to avoid this precision corner case
    return new Bernoulli(0.0);
}
double oldMean = mean;
mean /= totalWeight;
return new Bernoulli(mean);
That if statement with the "wtf" is there for debugging purposes, and it's being triggered. But the printout is:
Value 1.0 Weight 0.01
This if statement shouldn't be true at all! What is happening?
Edit: A little update on debugging. This is my current entire function:
public override IDistribution createDistribution(ConcurrentBag<Particle> particles)
{
    if (particles.Count == 0)
    {
        throw new Exception("Cannot create Distribution from empty particle collection");
    }
    if (!particles.ToArray()[0].GetType().IsAssignableFrom(typeof(BinaryParticle)))
    {
        throw new Exception("Cannot create Bernoulli Distribution from non-Binary Particle");
    }
    decimal mean = 0.0m;
    decimal totalWeight = 0.0m;
    foreach (Particle p in particles)
    {
        mean += (decimal)(p.Value * p.Weight);
        totalWeight += (decimal)p.Weight;
        if (p.Weight > 1.01)
        {
            Console.WriteLine("Value " + p.Value);
            Console.WriteLine("Weight " + p.Weight);
            Console.WriteLine("Value " + p.Value.ToString("0.0000000"));
            Console.WriteLine("wtf");
        }
    }
    if (totalWeight == 0.0m)
    {
        // in this case, everything has minuscule weight, so let's just return 0.0 to avoid this precision corner case
        return new Bernoulli(0.0);
    }
    decimal oldMean = mean;
    mean /= totalWeight;
    try
    {
        return new Bernoulli((double)mean);
    }
    catch (Exception e)
    {
        decimal testMean = 0.0m;
        decimal testTotalWeight = 0.0m;
        Console.WriteLine(e);
        foreach (Particle p in particles)
        {
            testMean += (decimal)(p.Value * p.Weight);
            testTotalWeight += (decimal)p.Weight;
            Console.WriteLine("weight is " + p.Weight);
            Console.WriteLine("value is " + p.Value);
            Console.WriteLine("Total mean is " + testMean);
            Console.WriteLine("Total weight is " + testTotalWeight);
        }
        Console.WriteLine(testMean / testTotalWeight);
        throw new Exception();
    }
}
"mean" is giving a different value than is being printed in the writeline in the catch block. I have no idea why. Also, bizarrely, it is weight > 1.01 that is the true condition, when weight is 0.01.
Okay, you guys are going to be mad, so let me start off by saying I'm sorry :-)
The problem was in fact a race condition, caused by a misunderstanding on my part about how locks in C# work. I was locking on an object whose instance could be reassigned in other methods, at the same time as the particle bag was being modified. Replacing that with a dedicated lock object fixed my problems; there's a rough before/after sketch below.
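For anyone who hits the same thing, here is a minimal sketch of the pattern that bit me and of the fix. The names here (BrokenFilter, FixedFilter, Resample, sync) are hypothetical, made up for illustration rather than taken from my actual code:

using System.Collections.Concurrent;
using System.Collections.Generic;

class Particle { public double Value; public double Weight; }

// BROKEN: the lock target is a field that another method reassigns.
// Once "particles" is swapped for a new instance, a reader and a writer can
// each hold a lock on a *different* object, so the bag gets enumerated while
// it is being rebuilt.
class BrokenFilter
{
    private ConcurrentBag<Particle> particles = new ConcurrentBag<Particle>();

    public void Resample(IEnumerable<Particle> resampled)
    {
        lock (particles)                                        // locks the OLD instance
        {
            particles = new ConcurrentBag<Particle>(resampled); // field now points at a new object
        }
    }

    public double WeightedMean()
    {
        lock (particles)   // may be a different object than the one the writer locked
        {
            double mean = 0.0, totalWeight = 0.0;
            foreach (Particle p in particles)
            {
                mean += p.Value * p.Weight;
                totalWeight += p.Weight;
            }
            return totalWeight == 0.0 ? 0.0 : mean / totalWeight;
        }
    }
}

// FIXED: one dedicated, readonly lock object whose identity never changes,
// so every reader and writer really does serialize on the same lock.
class FixedFilter
{
    private readonly object sync = new object();
    private ConcurrentBag<Particle> particles = new ConcurrentBag<Particle>();

    public void Resample(IEnumerable<Particle> resampled)
    {
        lock (sync)
        {
            particles = new ConcurrentBag<Particle>(resampled);
        }
    }

    public double WeightedMean()
    {
        lock (sync)
        {
            double mean = 0.0, totalWeight = 0.0;
            foreach (Particle p in particles)
            {
                mean += p.Value * p.Weight;
                totalWeight += p.Weight;
            }
            return totalWeight == 0.0 ? 0.0 : mean / totalWeight;
        }
    }
}

The key point is that lock(obj) only serializes threads that lock the same object instance; once the field is reassigned, the old and new "locks" are different objects and the mutual exclusion silently evaporates.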
Sorry ^_^;;