As is known, for tracking objects in OpenCV we can use feature matching: to match features, DescriptorMatcher uses the Hamming distance (the number of positions in which two sequences of the same length differ), not the distance between coordinates.
That is, we find the most similar object in the current frame, but not the one nearest to the previous position (if we know it).
How can we match using both the Hamming distance and the distance between coordinates, for example by weighting the two, instead of using the Hamming distance alone?
It could solve the following problems:
If we start tracking an object from position (x, y) in the previous frame, and the current frame contains two similar objects, we will find the most similar one, but not the nearest one. However, due to inertia, coordinates usually change more slowly than appearance (which can shift sharply with a change in lighting or a rotation of the object), so we should pick the similar object with the nearest coordinates.
Thus we find features that are not only the most similar, but that also yield the most accurate homography, because we exclude features that, although very similar, are very far away in coordinates and most likely belong to other objects.
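For context, a typical Hamming-only matching pipeline looks roughly like the sketch below; ORB descriptors and a brute-force matcher are assumed here, since the question does not name a specific detector or matcher.

```cpp
#include <vector>
#include <opencv2/core.hpp>
#include <opencv2/features2d.hpp>
#include <opencv2/imgcodecs.hpp>

int main()
{
    // Hypothetical input images: the previous and the current frame.
    cv::Mat prevFrame = cv::imread("prev.png", cv::IMREAD_GRAYSCALE);
    cv::Mat currFrame = cv::imread("curr.png", cv::IMREAD_GRAYSCALE);

    // Detect keypoints and compute binary (ORB) descriptors in both frames.
    cv::Ptr<cv::ORB> orb = cv::ORB::create();
    std::vector<cv::KeyPoint> prevKeypoints, currKeypoints;
    cv::Mat prevDescriptors, currDescriptors;
    orb->detectAndCompute(prevFrame, cv::noArray(), prevKeypoints, prevDescriptors);
    orb->detectAndCompute(currFrame, cv::noArray(), currKeypoints, currDescriptors);

    // Brute-force matcher with Hamming distance: the best match is the most
    // similar descriptor, no matter how far its keypoint has moved.
    cv::BFMatcher matcher(cv::NORM_HAMMING, /*crossCheck=*/true);
    std::vector<cv::DMatch> matches;
    matcher.match(prevDescriptors, currDescriptors, matches);

    return 0;
}
```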
What you need is probably something like this: DMatch has queryIdx and trainIdx indices. You can use these to retrieve the corresponding keypoints, compute the Euclidean distance between them, and update the distance value of each DMatch with some kind of weighting function (then re-sort the matches vector, since the sorting key, distance, has changed). Now the matches vector is sorted according to both the Hamming distance between descriptors and the Euclidean distance between the keypoints.
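A minimal C++ sketch of that idea is below. The weighting factor alpha and the helper name rescoreMatches are assumptions for illustration, not part of the OpenCV API; any other weighting function could be used instead.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>
#include <opencv2/core.hpp>
#include <opencv2/features2d.hpp>

// Combine the descriptor (Hamming) distance stored in each DMatch with the
// Euclidean distance between the matched keypoints, then re-sort the matches.
void rescoreMatches(std::vector<cv::DMatch>& matches,
                    const std::vector<cv::KeyPoint>& queryKeypoints,
                    const std::vector<cv::KeyPoint>& trainKeypoints,
                    float alpha) // weight in [0, 1]: 1 = Hamming only, 0 = coordinates only
{
    for (cv::DMatch& m : matches)
    {
        // Retrieve the matched keypoints via queryIdx and trainIdx.
        const cv::Point2f& p1 = queryKeypoints[m.queryIdx].pt;
        const cv::Point2f& p2 = trainKeypoints[m.trainIdx].pt;
        const float dx = p1.x - p2.x;
        const float dy = p1.y - p2.y;
        const float euclidean = std::sqrt(dx * dx + dy * dy);

        // Weighted combination; m.distance currently holds the Hamming distance.
        m.distance = alpha * m.distance + (1.0f - alpha) * euclidean;
    }

    // Re-sort because the sorting key (distance) has changed.
    std::sort(matches.begin(), matches.end(),
              [](const cv::DMatch& a, const cv::DMatch& b) {
                  return a.distance < b.distance;
              });
}
```

With the pipeline from the question, this could be called as rescoreMatches(matches, prevKeypoints, currKeypoints, 0.5f) right after matcher.match(...). Note that the two distances live on different scales (descriptor bits vs. pixels), so in practice you may want to normalize them before weighting.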