Tags: processing, kinect, gesture-recognition, openni, dtw

Dynamic time warping to detect gestures using Kinect Motion Sensor


Is there any documentation explaining how I should use DTW (dynamic time warping) with the Kinect? I need to record a gesture (as in this demo) and later use the recorded gesture to trigger a command via Simple Open-NI. I've downloaded the KinectSpace code (a .pde file), but I'm having trouble understanding how it is supposed to work.

From wikipedia:

    int DTWDistance(char s[1..n], char t[1..m], int w) {
        declare int DTW[0..n, 0..m]
        declare int i, j, cost

        w := max(w, abs(n-m)) // adapt window size (*)

        for i := 0 to n
            for j:= 0 to m
                DTW[i, j] := infinity
        DTW[0, 0] := 0

        for i := 1 to n
            for j := max(1, i-w) to min(m, i+w)
                cost := d(s[i], t[j])
                DTW[i, j] := cost + minimum(DTW[i-1, j  ],    // insertion
                                            DTW[i, j-1],    // deletion
                                            DTW[i-1, j-1])    // match

        return DTW[n, m]
    }
  1. What is the meaning of return DTW[n, m]?

  2. Should all the gestures be evaluated during the draw() method call? Can any optimisation be applied here?
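On question 1: DTW[n, m] is the bottom-right cell of the cost matrix, i.e. the accumulated cost of the cheapest monotonic alignment between the two full sequences; a smaller value means the sequences are more similar after warping. The pseudocode above translates fairly directly into runnable Java (the class and method names here are my own, and absolute difference stands in for the unspecified local distance d()):

```java
import java.util.Arrays;

public class DTW {
    // Accumulated warping cost between sequences s and t, restricted to a
    // Sakoe-Chiba band of half-width w, mirroring the pseudocode above.
    public static double distance(double[] s, double[] t, int w) {
        int n = s.length, m = t.length;
        w = Math.max(w, Math.abs(n - m));          // adapt window so a path exists
        double[][] dtw = new double[n + 1][m + 1];
        for (double[] row : dtw) Arrays.fill(row, Double.POSITIVE_INFINITY);
        dtw[0][0] = 0;
        for (int i = 1; i <= n; i++) {
            for (int j = Math.max(1, i - w); j <= Math.min(m, i + w); j++) {
                double cost = Math.abs(s[i - 1] - t[j - 1]);  // local distance d(s[i], t[j])
                dtw[i][j] = cost + Math.min(dtw[i - 1][j],    // insertion
                              Math.min(dtw[i][j - 1],         // deletion
                                       dtw[i - 1][j - 1]));   // match
            }
        }
        return dtw[n][m];  // total cost of the cheapest alignment path
    }
}
```

Identical sequences return 0, and a sequence with a repeated sample still matches its original at cost 0, which is exactly the time-warping property you want for gestures performed at different speeds.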


Solution

  • An implementation using the Kinect and DTW in Processing:

    gh/jonathansp/KinectRemoteControl
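On question 2, one common optimisation is to avoid scoring every template inside every draw() call: buffer the tracked joint positions each frame, and run DTW against the recorded templates only once a candidate gesture ends (e.g. the user pauses, or a fixed window elapses). A minimal sketch of that idea, with hypothetical class and method names and 1-D trajectories standing in for real joint coordinates:

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical matcher: compares a finished joint-position buffer against
// pre-recorded templates, paying the DTW cost once per gesture rather than
// once per draw() frame.
public class GestureMatcher {
    private final Map<String, double[]> templates = new LinkedHashMap<>();

    public void record(String name, double[] trajectory) {
        templates.put(name, trajectory);
    }

    // Returns the template name with the smallest DTW distance to the buffer,
    // or null if nothing scores below maxCost (an "unknown gesture" guard).
    public String classify(double[] buffer, int window, double maxCost) {
        String best = null;
        double bestCost = maxCost;
        for (Map.Entry<String, double[]> e : templates.entrySet()) {
            double c = dtw(buffer, e.getValue(), window);
            if (c < bestCost) { bestCost = c; best = e.getKey(); }
        }
        return best;
    }

    // Same banded DTW as the pseudocode in the question, on 1-D samples.
    static double dtw(double[] s, double[] t, int w) {
        int n = s.length, m = t.length;
        w = Math.max(w, Math.abs(n - m));
        double[][] d = new double[n + 1][m + 1];
        for (double[] row : d) Arrays.fill(row, Double.POSITIVE_INFINITY);
        d[0][0] = 0;
        for (int i = 1; i <= n; i++)
            for (int j = Math.max(1, i - w); j <= Math.min(m, i + w); j++)
                d[i][j] = Math.abs(s[i - 1] - t[j - 1])
                        + Math.min(d[i - 1][j], Math.min(d[i][j - 1], d[i - 1][j - 1]));
        return d[n][m];
    }
}
```

In a Processing sketch this would mean appending the joint coordinate to a buffer inside draw() (cheap), and calling classify() only at gesture boundaries; the maxCost threshold also lets you reject movements that match no recorded gesture.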