Goal
I am trying to apply the object detection functionality of the Breakfast Finder sample code to my own app. When I add my own model to the Breakfast Finder sample code and run it, it detects my objects and shows their labels just fine.
Problem
When I attempt to add the sample code to a test app (a new xcodeproj file), I can't get the live camera feed. I just get the camera permission pop-up and a blank screen.
What I did to get the problem
- Copied the ViewController and VisionObjectRecognitionViewController swift files from the sample code into the new project.
- Connected the @IBOutlet in the ViewController (line 17).
- Added NSCameraUsageDescription to the Info file (with a value).

On another attempt, I tried just copying all the files (swift, plist, mlmodel, etc.) over from the sample code and troubleshooting connection issues, but got the same problem.
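For reference, here is roughly what all has to be in place for a live feed to appear: the NSCameraUsageDescription entry, a connected previewView outlet, and an AVCaptureVideoPreviewLayer attached to that view. This is a minimal sketch rather than the sample's actual code (PreviewTestViewController and the exact setup are my own stand-ins); if any one of these pieces is missing, the symptom is exactly what I saw: the permission pop-up appears, but the screen stays blank.

```swift
import UIKit
import AVFoundation

class PreviewTestViewController: UIViewController {
    // Connected to a full-screen UIView in the storyboard; if the outlet
    // connection is missing, this is nil and the preview layer has nowhere to go.
    @IBOutlet weak var previewView: UIView!

    private let session = AVCaptureSession()
    private var previewLayer: AVCaptureVideoPreviewLayer!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Triggers the camera permission pop-up on first launch; this requires
        // NSCameraUsageDescription in the Info.plist.
        AVCaptureDevice.requestAccess(for: .video) { granted in
            guard granted else { return } // user denied camera access
            DispatchQueue.main.async { self.setupAVCapture() }
        }
    }

    private func setupAVCapture() {
        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .back),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // The preview layer is what actually draws the camera feed on screen.
        previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.videoGravity = .resizeAspectFill
        previewLayer.frame = previewView.layer.bounds
        previewView.layer.addSublayer(previewLayer)

        session.startRunning()
    }
}
```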
Final Thoughts
Why does the Breakfast Finder sample code result in a blank screen after adding it to a new xcodeproj file? I have never dealt with a live camera feed before, so I might have overlooked a simple problem. I have an iPhone XR running iOS 15. You can find a link to the sample code here, or google "Breakfast Finder".
I eventually noticed that the view controller in the sample's storyboard was actually VisionObjectRecognitionViewController, not the plain ViewController my storyboard was pointing at. So I replaced the view controller: in my storyboard I selected the view controller, opened the Identity Inspector, and changed its Class to VisionObjectRecognitionViewController.
And that fixed it! Not only could I see the live video feed, I could also see my model detecting objects. Thank you @Iker Solozabal for providing similar steps on another question.
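For anyone who hits the same wall: the reason the storyboard class matters is how the sample's two controllers relate. Roughly (a simplified sketch from memory, not the sample's actual code), the base class owns the capture session and the subclass layers the Vision work on top, so a storyboard that instantiates the wrong class never runs either setup:

```swift
import UIKit

// Base class: owns the AVCaptureSession and the camera preview.
class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        setupAVCapture()
    }

    func setupAVCapture() {
        // configure the capture session and preview layer here
    }
}

// Subclass: adds the video output and the Core ML / Vision detection request.
// The storyboard's Identity Inspector Class field decides whether this
// subclass (camera + detection) or something else gets instantiated.
class VisionObjectRecognitionViewController: ViewController {
    override func setupAVCapture() {
        super.setupAVCapture()
        // add the video data output and the object detection request here
    }
}
```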