javascript · iteration · bit-manipulation · typed-arrays

More efficient way to copy repeating sequence into TypedArray?


I have a source Float32Array from which I create a secondary Float32Array. I have a sequence of model values that I want to copy as a repeating sequence into the secondary Float32Array. I am currently doing this operation with a reverse while loop.

var sequence = [1, 0, 0, 0, 0, 1, 0, 0, 2, 0, 1, 0];
var n = 3179520; // divisible by sequence length
var modelBuffs = new Float32Array(n);

var v = modelBuffs.length;

while(v){
  // XTransform
  modelBuffs[v-12] = sequence[0];
  modelBuffs[v-11] = sequence[1];
  modelBuffs[v-10] = sequence[2];
  modelBuffs[v-9] = sequence[3];

  // YTransform
  modelBuffs[v-8] = sequence[4];
  modelBuffs[v-7] = sequence[5];
  modelBuffs[v-6] = sequence[6];
  modelBuffs[v-5] = sequence[7];

  // ZTransform
  modelBuffs[v-4] = sequence[8];
  modelBuffs[v-3] = sequence[9];
  modelBuffs[v-2] = sequence[10];
  modelBuffs[v-1] = sequence[11];

  v -= 12;
}

Unfortunately, n can be unknown. I may have to do a significant refactor if there is no alternative solution. I am hoping that I can set the sequence once and use some copy-in-place / repeating-fill / bitwise operation to repeat the initial byte sequence.

Edit: simplified the example input.


Solution

  • A fast method to fill an array with a repeated sequence is to double the length of the filled segment on each iteration using the copyWithin() method of the typed array. You could use set() as well, by creating a different view of the same underlying ArrayBuffer, but it's simpler to use the former for this purpose (a sketch of the set() variant follows the timed example below).

    Using, for example, 1234 as the source, the first iteration fills 1:1 from the source, or 4 indices in this case:

    1234
    

    From there we use the destination buffer as the source for the remaining fill, so the second iteration fills 8 indices:

    12341234
    

    Third iteration fills 16 indices:

    1234123412341234
    

    Fourth iteration fills 32 indices:

    12341234123412341234123412341234
    

    and so forth.

    If the remaining length doesn't match a power of 2, you can simply use the difference between the buffer length and the current fill position as the segment length for the last iteration.

    var 
      srcBuffer = new Uint8Array([1,2,3,4]), // any view type will do
      dstBuffer = new Uint8Array(1<<14),     // 16 kb
      len = dstBuffer.length,   // important: use indices length, not byte-length
      sLen = srcBuffer.length,
      p = sLen;                 // set initial position = source sequence length
    
    var startTime = performance.now();
    
    // step 1: copy source sequence to the beginning of dest. array
    // todo: dest. buffer might be smaller than source. Check for this here.
    dstBuffer.set(srcBuffer);
    
    // step 2: copy existing data doubling segment length per iteration
    while(p < len) {
      if (p + sLen > len) sLen = len - p;  // if not power of 2, truncate last segment
      dstBuffer.copyWithin(p, 0, sLen);    // internal copy
      p += sLen;                           // add current length to offset
      sLen <<= 1;                          // double length for next segment
    }
    
    var time = performance.now() - startTime;
    console.log("done", time + "ms");
    console.log(dstBuffer);
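
    The same doubling strategy can also be written with set() and a subarray() view of the same underlying buffer instead of copyWithin(). A rough sketch (the function name fillRepeat is just for illustration):

    function fillRepeat(dst, src) {
      var p = Math.min(src.length, dst.length);
      dst.set(src.subarray(0, p));                // seed with the source sequence

      while(p < dst.length) {
        var sLen = Math.min(p, dst.length - p);   // double, truncating the last segment
        dst.set(dst.subarray(0, sLen), p);        // subarray = view on the same buffer
        p += sLen;
      }
      return dst;
    }

    fillRepeat(new Float32Array(24), new Float32Array([1,0,0,0, 0,1,0,0, 2,0,1,0]));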

    If the array is very long it will obviously take some time regardless. In those cases you could consider using a Web Worker together with the new SharedArrayBuffer, so that the copying runs in a separate thread and you don't have to copy or transfer the data to and from it. The gain is merely that the main thread isn't blocked; there is little to save in the copy itself, since the internals of copyWithin() are already close to optimal for this purpose. The cons are the async aspect combined with the overhead of the event system, so whether this is useful depends on the case.
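
    A minimal sketch of that setup, assuming a worker file named fill-worker.js and an environment where SharedArrayBuffer is available:

    // main.js (the file name fill-worker.js is an assumption)
    var n = 3179520,
        sab = new SharedArrayBuffer(n * Float32Array.BYTES_PER_ELEMENT),
        modelBuffs = new Float32Array(sab),      // view over the shared memory
        worker = new Worker("fill-worker.js");

    worker.onmessage = function() {
      // the shared memory is now filled; nothing is copied or transferred back
      console.log("filled", modelBuffs[12]);     // 1 (start of the 2nd repetition)
    };

    // a SharedArrayBuffer is shared via postMessage, not copied
    worker.postMessage({buffer: sab, sequence: [1,0,0,0, 0,1,0,0, 2,0,1,0]});

    // fill-worker.js
    onmessage = function(e) {
      var dst = new Float32Array(e.data.buffer),
          sLen = Math.min(e.data.sequence.length, dst.length),
          p = sLen;

      dst.set(e.data.sequence.slice(0, sLen));   // step 1: seed the sequence

      while(p < dst.length) {                    // step 2: doubling copyWithin
        if (p + sLen > dst.length) sLen = dst.length - p;
        dst.copyWithin(p, 0, sLen);
        p += sLen;
        sLen <<= 1;
      }
      postMessage("done");
    };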

    A different approach is to use WebAssembly, where you write the buffer-fill code in C/C++, compile it, and expose methods that take the source and destination buffers, then call those from JavaScript. I don't have any example for this case.

    In both of these latter cases you will run into compatibility issues with (not that much) older browsers.
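
    A quick feature check before picking a strategy might look like this (sketch):

    var hasCopyWithin = typeof Float32Array.prototype.copyWithin === "function",
        hasSharedArrayBuffer = typeof SharedArrayBuffer !== "undefined",
        hasWebAssembly = typeof WebAssembly !== "undefined";
    // fall back to a plain loop when none of the above are available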