In which scenarios will range partitioning be a better choice than chunk partitioning (and vice versa)?
I already know that:

Chunk partitioning: each worker grabs small chunks of elements from the input to process; it starts with small chunks and then increases the chunk size.

Range partitioning: preallocates an equal number of elements to each worker.
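For context, here is a small sketch of how PLINQ picks between the two (based on the documented defaults: indexable sources such as arrays and `IList<T>` get range partitioning, other `IEnumerable<T>` sources get chunk partitioning) and how `Partitioner.Create` with `loadBalance: true` can force load-balanced chunking even for an array:

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq;

class PartitioningDemo
{
    static void Main()
    {
        int[] data = Enumerable.Range(1, 1000).ToArray();

        // An array is indexable, so PLINQ defaults to range partitioning here.
        int rangeSum = data.AsParallel().Select(x => x * 2).Sum();

        // Wrapping the same array in a load-balancing partitioner forces
        // chunk-style (load-balanced) partitioning instead.
        int chunkSum = Partitioner.Create(data, loadBalance: true)
                                  .AsParallel()
                                  .Select(x => x * 2)
                                  .Sum();

        // Same result either way; only the work distribution differs.
        Console.WriteLine(rangeSum == chunkSum);
    }
}
```

The choice only affects how elements are handed to worker threads, never the query's result.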
Also, why is it that this code (finding the prime numbers up to 100000):
IEnumerable<int> numbers = Enumerable.Range (3, 100000-3);
var parallelQuery = from n in numbers.AsParallel()
where Enumerable.Range (2, (int) Math.Sqrt (n)).All (i => n % i > 0)
select n;
might perform poorly with range partitioning,
while this code (finding the sum of the square roots of the first ten million numbers):
ParallelEnumerable.Range (1, 10000000).Sum (i => Math.Sqrt (i))
will be a better choice for range partitioning?
In the first sample, the required time per item depends on n: checking numbers around 90000 for primality takes much longer than checking numbers around 11. As a result, when the input is divided into equal ranges, the last ranges have to perform far more work than the first, and the workers that finish early sit idle.
In the second sample, the time per operation is constant over the entire range, so range partitioning works well.
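To make that concrete, here is a sketch of my own (not from the question) pairing each workload with a suitable partitioner. Note that the prime query as written already gets chunk partitioning, because `Enumerable.Range` yields a plain `IEnumerable<int>`; the problem only appears if the source becomes indexable (e.g. after `ToArray()`), which the `loadBalance: true` partitioner below counteracts. `ParallelEnumerable.Range`, by contrast, uses cheap range partitioning, which suits the uniform square-root workload:

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq;

class WorkloadDemo
{
    static void Main()
    {
        // Uneven work: primality testing gets costlier as n grows.
        // The array would default to range partitioning, so we ask for
        // load balancing (chunk-style partitioning) explicitly.
        int[] numbers = Enumerable.Range(3, 100000 - 3).ToArray();
        int[] primes = Partitioner.Create(numbers, loadBalance: true)
            .AsParallel()
            .Where(n => Enumerable.Range(2, (int)Math.Sqrt(n)).All(i => n % i > 0))
            .ToArray();

        // Uniform work: every Math.Sqrt call costs the same, so the
        // lock-free range partitioning of ParallelEnumerable.Range fits well.
        double sum = ParallelEnumerable.Range(1, 10000000).Sum(i => Math.Sqrt(i));

        Console.WriteLine(primes.Length);
        Console.WriteLine(sum);
    }
}
```

The rule of thumb from the answer: uneven per-element cost favors chunk (load-balanced) partitioning; uniform per-element cost favors range partitioning, which avoids the synchronization overhead of handing out chunks.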