Tags: f#, deedle

Am I using Deedle Series.map correctly?


After some testing on different collections, I wanted to see which one would perform best. I tested an array, a seq, a list, and a Deedle Series of 1,000,000 points drawn uniformly at random between 0.0 and 1.0. I then applied each collection's respective map function to the sigmoid function:

let sigmoid x = 1. / (1. + exp(-x))

I then used BenchmarkDotNet to measure the average execution time, and the result for Deedle.Series is what I would consider "ugly". It seems to me that Deedle is really not map-friendly. Am I doing things correctly?
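A minimal BenchmarkDotNet harness along these lines might look roughly like this (the class and member names are illustrative, not exact):

// Sketch of a BenchmarkDotNet setup comparing map over the four collections.
// Attribute and runner names come from BenchmarkDotNet; everything else is illustrative.
open BenchmarkDotNet.Attributes
open BenchmarkDotNet.Running
open Deedle

[<MemoryDiagnoser>]
type MapBenchmarks() =
    let rnd = System.Random()
    let data = Array.init 1000000 (fun _ -> rnd.NextDouble())
    let lst = List.ofArray data
    let s = series [ for i in 0 .. data.Length - 1 -> i, data.[i] ]
    let sigmoid x = 1. / (1. + exp(-x))

    [<Benchmark>]
    member this.Array() = data |> Array.map sigmoid

    [<Benchmark>]
    member this.List() = lst |> List.map sigmoid

    [<Benchmark>]
    // Seq.map is lazy, so force it with Seq.toArray to measure the actual work
    member this.Seq() = data |> Seq.map sigmoid |> Seq.toArray

    [<Benchmark>]
    member this.Series() = s |> Series.map (fun _ v -> sigmoid v)

// BenchmarkRunner.Run<MapBenchmarks>() |> ignore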

// * Summary *

BenchmarkDotNet=v0.11.5, OS=Windows 7 SP1 (6.1.7601.0)
Intel Xeon CPU E5-1620 v3 3.50GHz, 1 CPU, 8 logical and 4 physical cores
Frequency=3410126 Hz, Resolution=293.2443 ns, Timer=TSC
.NET Core SDK=3.0.100-preview5-011568
  [Host]     : .NET Core 3.0.0-preview5-27626-15 (CoreCLR 4.6.27622.75, CoreFX 4.700.19.22408), 64bit RyuJIT DEBUG  [AttachedDebugger]
  DefaultJob : .NET Core 3.0.0-preview5-27626-15 (CoreCLR 4.6.27622.75, CoreFX 4.700.19.22408), 64bit RyuJIT


|             Method |        Mean |      Error |     StdDev |      Gen 0 |     Gen 1 |     Gen 2 | Allocated |
|------------------- |------------:|-----------:|-----------:|-----------:|----------:|----------:|----------:|
|              Array |    21.29 ms |  0.4217 ms |  0.9255 ms |   406.2500 |  406.2500 |  406.2500 |  15.26 MB |
|               List |   173.52 ms |  2.9243 ms |  2.7354 ms | 11250.0000 | 4500.0000 | 1500.0000 |  61.04 MB |
|                Seq |   127.90 ms |  2.5884 ms |  7.4267 ms | 36600.0000 |         - |         - | 183.11 MB |
|             Series | 1,751.04 ms | 37.6797 ms | 59.7640 ms | 99000.0000 | 6000.0000 | 6000.0000 | 603.31 MB |

Solution

  • I think your measurements are most likely correct. A Deedle series definitely adds notable overhead over an array. This is because it also adds a lot of extra functionality around handling missing values, along with all the features that follow from the fact that a series is a key-value mapping.

    If you are doing purely numerical computations that do not involve messy data or data with index, then you should probably use a matrix manipulation library or raw arrays.

    My simple measurements using #time are as follows:

    #time
    let rnd = System.Random()
    let s = series [ for i in 0 .. 1000000 -> i, rnd.NextDouble() ]
    let a = [| for i in 0 .. 1000000 -> rnd.NextDouble() |]

    // ~950ms - each overloaded numeric operator creates a new series
    let r = 1. / (1. + exp(-s))

    // ~290ms - a single pass over the series
    s |> Series.map (fun _ v -> 1. / (1. + exp(-v)))

    // ~25ms - a single pass over a raw array
    a |> Array.map (fun v -> 1. / (1. + exp(-v)))


    It's worth noting that Series.map is much faster than applying overloaded binary operators directly (as in 1. / (1. + exp(-s))), because it needs to create only one new series instance rather than one per operator.
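
    If you do go the matrix-library route, a Math.NET Numerics vector is one option. As a hedged sketch (this assumes the MathNet.Numerics and MathNet.Numerics.FSharp packages are referenced):

    open MathNet.Numerics.LinearAlgebra

    let rnd2 = System.Random()
    // DenseVector.init builds a dense float vector from an index-based initializer
    let v = DenseVector.init 1000000 (fun _ -> rnd2.NextDouble())

    // Vector.map applies the sigmoid element-wise in a single pass,
    // without any key-index or missing-value machinery
    let r2 = v |> Vector.map (fun x -> 1. / (1. + exp(-x)))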