This is an interesting one. First of all, the setup:
I have a memory-intensive algorithm that crunches through data using LINQ/PLINQ. The initial implementation relied on PLINQ, and a unit test succeeds. But, remarkably, while trying to measure what kind of performance benefit I get from PLINQ vs. plain LINQ, the same test fails with an OutOfMemoryException when parallelism is disabled.
Is there any reasonable explanation? I can reproduce this consistently. I haven't checked, but could I be running out not of memory but of some other resource that is allocated differently depending on the threading model? Any ideas?
A single object in .NET is limited to 2 GB, even on 64-bit (on .NET 4.5+ this limit can be lifted for arrays via the gcAllowVeryLargeObjects setting). If you are not using PLINQ, my guess is that some object (for example, a List&lt;T&gt;, whose backing array is a single contiguous object) is growing beyond 2 GB and the allocation fails. With PLINQ - since it splits its work across partitions - my guess is that it creates multiple smaller lists that each stay under 2 GB.
Please post some code; without it, it's impossible to give any more detail.
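In the meantime, here is a hypothetical sketch (not the asker's code, and the sizes are made up) of the kind of pattern that could behave this way: a query materialized with `ToList()` accumulates into one `List<long>` whose single backing array can approach the 2 GB per-object limit, while the parallel version buffers per partition during the query.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class Repro
{
    static void Main()
    {
        // ~280M longs * 8 bytes ≈ 2.2 GB if held in one contiguous array.
        IEnumerable<long> source = Enumerable.Range(0, 280_000_000)
                                             .Select(i => (long)i * 2);

        // Sequential: ToList() grows a single List<long>. Its backing array
        // is one object, and List<T> doubles capacity on growth, so the
        // transient footprint is even larger -> OutOfMemoryException is
        // plausible here even with free RAM available.
        // List<long> all = source.ToList();

        // Parallel: PLINQ partitions the work, so intermediate buffers are
        // several smaller objects rather than one huge one. (Note: merging
        // everything into one final list of the same size would still hit
        // the same limit; the difference is in the intermediates.)
        long count = source.AsParallel().Count(x => x % 3 == 0);
        Console.WriteLine(count);
    }
}
```

The sequential lines are commented out precisely because they are the suspected crasher; the point is only that the per-object limit, not total memory, can be what differs between the two execution modes.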