Tags: javascript, list, optimization, arrow-functions

JavaScript - Optimal way to update properties of some items in an array


let list = [ 
 { name: "a", position: 10 },
 { name: "b", position: 71 },
 { name: "c", position: 2 },     
 { name: "d", position: 34 },
 { name: "e", position: 1 },
 { name: "f", position: 0 }
]

I want to update all the items whose position is greater than 2, increasing their position by 1.

I went with this code, but I fear it would be very slow for lists with thousands of members:

    list.forEach(i => {
        if (i.position > 2) {
            i.position++;
        }
    });

I'm sure there are better ways to do this, so please help me out.


Solution

  • You don't really have an optimization issue here, especially when the operation you're performing on each element is as trivial and quick as this one. Per @ASDFGerte's comment, you can even write a branch-less version, sketched below.
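
    That comment isn't reproduced here, but one common way to write a branch-less version is to let JavaScript coerce the boolean comparison to a number when adding it. Treat this as a sketch only; the engine may still branch internally, so don't expect a measurable win:

    list.forEach(i => {
      i.position += (i.position > 2); // true coerces to 1, false to 0, so no explicit if
    });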

    I threw together an example where I create an array of 1 million objects and send it through the forEach you've provided; it takes only 15-30 ms per run:

    function test() {
      let arr = [];
    
      //Populate array with 1 million objects with pos. from 0-99
      for (let i = 1000000; i > 0; i--)
        arr[i] = {
          name: i.toString(),
          position: Math.floor(Math.random() * 100)
        };
    
      let start = performance.now(); //Record start time
    
      arr.forEach(i => { //Exact loop from question
        if (i.position > 2) {
          i.position++;
        }
      });
    
      let end = performance.now(); //Record end time
      console.log('Time to Execute:', end - start, 'ms');
    }
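
    <!-- HTML portion of the runnable snippet: the button that triggers test() -->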
    <button onclick="test()">Execute</button>

    If your quantity is on the order of "thousands", as you've mentioned, then you can stop right here and sleep easy with what you've got, because that's probably plenty fast enough for your use.

    However, if you've got quantities even larger than this, say tens or hundreds of millions, then you may need to look into something called Web Workers in order to split the task into chunks that run concurrently.

    You could split your list into pieces smaller than some threshold (e.g., 1 million) and have the workers process those pieces at the same time, so that the total time is only as long as the slowest chunk takes. This wouldn't be about optimizing your loop logic anymore, though; a rough sketch of the idea follows.
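
    Here is a minimal sketch of the chunk-and-fan-out idea. The file names (main.js, increment-worker.js) and the chunk size are placeholders, and note that postMessage copies plain objects via structured clone, so for data shaped like this the copy can easily cost more than the loop itself; to avoid that you'd typically look at transferable objects or a typed array backed by a SharedArrayBuffer.

    // main.js (hypothetical file): split the list into chunks and fan them out to workers
    const CHUNK_SIZE = 1000000; // the threshold mentioned above; tune as needed

    async function updateInParallel(list) {
      const chunks = [];
      for (let i = 0; i < list.length; i += CHUNK_SIZE)
        chunks.push(list.slice(i, i + CHUNK_SIZE));

      // One worker per chunk; each worker posts back its processed chunk
      const results = await Promise.all(chunks.map(chunk => new Promise(resolve => {
        const worker = new Worker('increment-worker.js'); // hypothetical worker file
        worker.onmessage = e => { resolve(e.data); worker.terminate(); };
        worker.postMessage(chunk); // structured clone: the chunk is copied, not shared
      })));

      return results.flat(); // Promise.all preserves order, so this reassembles the list
    }

    // increment-worker.js (hypothetical file): apply the same loop to one chunk
    self.onmessage = e => {
      const chunk = e.data;
      chunk.forEach(i => {
        if (i.position > 2) i.position++;
      });
      self.postMessage(chunk);
    };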