I've started looking into OpenKinect development, and my first goal is to figure out how to detect certain gestures made by a person.
Are there any tutorials out there on how to do this? Or what would be a good place to start?
I'm just trying to detect things like a person turning their hand in one direction or the other. That said, I'd certainly appreciate any sort of help at all!
UPDATE: From what I can tell, I'll most likely be using the OpenNI/NITE frameworks, along with the ONIPY Python wrappers. So unless there's a better framework, I now just need to figure out how to make my own gestures.
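For a custom gesture, a common starting point (independent of which framework feeds you the data) is a simple heuristic over the tracked hand positions. Here's a minimal sketch assuming you already get a stream of hand x-coordinates (in millimeters, as a NITE-style hand tracker would report); the function name and thresholds are made up for illustration and would need tuning against real data:

```python
# Hypothetical sketch: classify a horizontal hand swipe from a short
# history of tracked hand x-coordinates. The 200 mm threshold is an
# assumption to tune once you see real tracker output.

def detect_swipe(x_positions, threshold=200.0):
    """Return 'left', 'right', or None based on net hand displacement."""
    if len(x_positions) < 2:
        return None
    displacement = x_positions[-1] - x_positions[0]
    if displacement > threshold:
        return 'right'
    if displacement < -threshold:
        return 'left'
    return None

# Fabricated coordinate traces for illustration:
print(detect_swipe([0.0, 80.0, 160.0, 260.0]))    # 'right'
print(detect_swipe([100.0, 40.0, -50.0, -150.0]))  # 'left'
```

In practice you'd buffer the last second or so of positions and run a check like this each frame; more robust recognizers (e.g. dynamic time warping or HMMs) follow the same pattern of classifying a window of tracked points.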
I'm not sure it will be exactly what you want, but my brother has used the OpenNI/NITE library to recognize some gestures on the Kinect using Ruby. I saw a demo where the computer recognized him waving at it.
There are Python bindings for that library via the onipy project, though I haven't used them personally. I suspect they may still need some work, but I would certainly look into them. You'll probably also want to read the documentation on the OpenNI website.
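If you end up rolling your own wave detector like the demo above, one simple approach is to count direction reversals in the hand's horizontal trace. This is a hedged sketch, not NITE's actual algorithm; the function name and the movement/reversal thresholds are assumptions for illustration:

```python
# Hypothetical wave detector: a wave shows up as repeated left/right
# direction reversals in the tracked hand's x-coordinate. Thresholds
# (30 mm minimum movement, 3 reversals) are placeholder values.

def detect_wave(x_positions, min_reversals=3, min_move=30.0):
    """Return True if the x trace reverses direction at least min_reversals times."""
    if not x_positions:
        return False
    reversals = 0
    last_dir = 0          # -1 = moving left, 1 = moving right, 0 = unknown
    last_x = x_positions[0]
    for x in x_positions[1:]:
        delta = x - last_x
        if abs(delta) < min_move:
            continue      # ignore jitter below the movement threshold
        direction = 1 if delta > 0 else -1
        if last_dir != 0 and direction != last_dir:
            reversals += 1
        last_dir = direction
        last_x = x
    return reversals >= min_reversals

# A back-and-forth trace counts as a wave; a one-way trace does not.
print(detect_wave([0.0, 100.0, 0.0, 100.0, 0.0]))  # True
print(detect_wave([0.0, 100.0, 200.0, 300.0]))     # False
```

Whatever framework supplies the hand positions, logic like this can sit on top of it, which keeps the gesture definitions portable if you later switch libraries.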