Tags: objective-c, ios11, coreml, apple-vision

How can I use the object tracking API of the Vision framework on iOS 11?


// Initial bounding box to track (Vision uses normalized coordinates, 0..1)
CGRect rect = CGRectMake(0, 0, 0.3, 0.3);
NSError* error = nil;

// Sequence handler that keeps tracking state across frames
VNSequenceRequestHandler* reqImages = [[VNSequenceRequestHandler alloc] init];

// Seed a tracking request with the initial observation
VNRectangleObservation* ObserveRect = [VNRectangleObservation observationWithBoundingBox:rect];
VNTrackRectangleRequest* reqRect = [[VNTrackRectangleRequest alloc] initWithRectangleObservation:ObserveRect];
NSArray<VNRequest *>* requests = [NSArray arrayWithObjects:reqRect, nil];
BOOL bsucc = [reqImages performRequests:requests onCGImage:img.CGImage error:&error];

// Get the tracked bounding box on the next frame
VNDetectRectanglesRequest* reqRectTrack = [VNDetectRectanglesRequest new];
NSArray<VNRequest *>* requestsTrack = [NSArray arrayWithObjects:reqRectTrack, nil];
[reqImages performRequests:requestsTrack onCGImage:img.CGImage error:&error];

VNRectangleObservation* Observe = [reqRectTrack.results firstObject];
CGRect boundingBox = Observe.boundingBox;

Why is the boundingBox value incorrect?

Where can I find a demo of Vision.framework for iOS 11?


Solution

  • Here is my simple example of using the Vision framework: https://github.com/artemnovichkov/iOS-11-by-Examples. I guess you have a problem with different coordinate systems. Pay attention to the rect conversion:

    cameraLayer.metadataOutputRectConverted(fromLayerRect: originalRect)

    and

    cameraLayer.layerRectConverted(fromMetadataOutputRect: transformedRect)
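
    For context, here is a minimal sketch of how the two conversions above are usually combined (the function names are mine, and it assumes the buffers you hand to Vision share the preview layer's video orientation). Vision reports boundingBox in normalized coordinates with the origin in the lower-left corner, while the metadata output rect and UIKit use a top-left origin, so the Y axis has to be flipped on the way in and out:

    import UIKit
    import AVFoundation
    import Vision

    /// Converts a Vision observation's normalized bounding box into the
    /// preview layer's coordinate space so it can be drawn on screen.
    func layerRect(for observation: VNDetectedObjectObservation,
                   in previewLayer: AVCaptureVideoPreviewLayer) -> CGRect {
        let box = observation.boundingBox
        // Flip the Y axis: Vision's origin is bottom-left, metadata rects are top-left.
        let metadataRect = CGRect(x: box.minX,
                                  y: 1 - box.maxY,
                                  width: box.width,
                                  height: box.height)
        return previewLayer.layerRectConverted(fromMetadataOutputRect: metadataRect)
    }

    /// Converts a rect selected on the preview layer (e.g. from a user tap or drag)
    /// into Vision's normalized space, suitable for seeding a tracking request.
    func visionRect(forLayerRect rect: CGRect,
                    in previewLayer: AVCaptureVideoPreviewLayer) -> CGRect {
        let metadataRect = previewLayer.metadataOutputRectConverted(fromLayerRect: rect)
        // Flip the Y axis back into Vision's bottom-left-origin space.
        return CGRect(x: metadataRect.minX,
                      y: 1 - metadataRect.maxY,
                      width: metadataRect.width,
                      height: metadataRect.height)
    }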