
How to select the indices of a price series where there is a difference of x bips?


I have a price series and I'd like to know the indices where there has been a change of x bips. I worked out a very ugly way to accomplish this with a loop, e.g.:

q)bips:200
q)level:0.001*bips / 0.2
q)price: 1.0 1.1 1.3 1.8 1.9 2.0 2.3
q)ix:0
q)lastix:0
q)result:enlist lastix
q)do[count price;if[abs(price[ix]-price[lastix])>level;result,:ix;lastix:ix];ix:ix+1];
q)result
0 2 3 6

This is a simple O(n) algorithm that walks through the price series keeping a marked index (lastix), starting from the first element; whenever it finds a price whose difference from the marked one is greater than the level, it saves that index and updates lastix to it. Is there a more idiomatic way to do this?

My if condition inside the loop is somewhat flawed; I don't know exactly why, but if I check abs(price[lastix]-price[ix]) instead of abs(price[ix]-price[lastix]) it doesn't give correct results.
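
(One possible explanation, assuming standard q right-to-left evaluation: abs(price[ix]-price[lastix])>level is parsed as abs applied to the boolean result of the comparison, so the abs effectively does nothing and the condition only tests for an upward move of more than level, which happens to be enough for this rising series; with the operands swapped it only tests for a drop. A quick sketch of what each form actually evaluates:)

q)price:1.0 1.1 1.3 1.8 1.9 2.0 2.3
q)level:0.2
q)(price[2]-price[0])>level      / what abs(price[ix]-price[lastix])>level effectively tests
1b
q)(price[0]-price[2])>level      / the swapped form tests for a drop instead, so it never fires here
0b
q)(abs price[0]-price[2])>level  / explicit grouping applies abs to the difference, as intended
1b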

UPDATE: I was aware of deltas, but it compares consecutive elements only, and that's not what I need. I apologize if the price series in the OP was ambiguous and happened to give correct results with a simple deltas approach. Here is a counterexample with a new price series:

q)price: 1.0 1.1 1.21 1.42 1.4 1.32 1.63
q)where level<abs deltas price
0 3 6

and this is not correct. The correct result, which is produced by the accepted answer, is still

0 2 3 6

Solution

  • I think you're looking for something like this maybe:

    f:{where differ{$[level<abs[y-x];y;x]}\[x]}
    

    this carries forward the last value that satisfied your condition, using the $[;;] conditional with the scan adverb, and then uses differ to pick out the points where the condition was satisfied and the carried value was updated (the intermediate values are sketched after this answer).

    If I've understood your problem correctly, the same result should come from

    newprice:1 1.1 1.3 1.8 1.9 2 2.1
    

    since the final value, 2.1, is more than 0.2 greater than 1.8, the last value at which the carried price was updated.

    q)f newprice
    0 2 3 6
    

    Thanks, Ryan
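
    For completeness, here is a sketch of the intermediate values for the original price series, assuming level:0.2 (as in the question) and the f defined above; the scan carries the last accepted price forward, and differ flags the positions where that carried value changes:

    q)level:0.2
    q)price:1.0 1.1 1.3 1.8 1.9 2.0 2.3
    q)f:{where differ{$[level<abs[y-x];y;x]}\[x]}
    q){$[level<abs[y-x];y;x]}\[price]            / last accepted price, carried forward
    1 1 1.3 1.8 1.8 1.8 2.3
    q)differ {$[level<abs[y-x];y;x]}\[price]     / 1b wherever the carried price changes
    1011001b
    q)f price
    0 2 3 6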